## Posts Tagged ‘power rating’

### How to Read the Power Rating of an Electrical Appliance

3, Oct 2014

It is time to get to know how to read the power rating of an electrical appliance and how to use that information to calculate a rough estimate of the expected energy usage. The actual energy usage will vary depending on the device and how it operates, but using the power rating we should at least be able to get a good estimate of the maximum expected energy usage. The best way to learn the actual energy usage over a period of time is to use a power meter and/or electricity meter that will measure it accurately. So let us start with an example: in the photo above you can see the label of a power supply of an HP laptop. What we need to look at are the numbers in the lower left corner of the label, as these are the power ratings.

The 200W means that the power supply of the laptop has a power rating of 200W, but this is actually the maximum power that the PSU is capable of providing, so the laptop it is connected to will not actually be consuming 200W of power. We can however use the number 200 to calculate the maximum energy usage of the laptop, assuming it really maxes out what the power supply is capable of providing. So if the laptop runs for 1 hour at 200W of power usage, it will have used 0.2 kWh (200 Wh) of energy that we need to pay for, though in reality that number should be lower. Now, if you don’t see a number in Watts written directly on the label of the power supply, but just the Input and Output numbers, you should still be able to do the math yourself.
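The worst-case estimate above can be sketched as a small Python helper (a minimal sketch; the function name is my own):

```python
# Upper-bound energy estimate from a power rating: energy (kWh) = W * h / 1000.
# Real consumption is usually lower, since devices rarely run at full rated power.
def max_energy_kwh(power_rating_w, hours):
    """Energy in kWh if the device ran at its full rated power the whole time."""
    return power_rating_w * hours / 1000

print(max_energy_kwh(200, 1))  # 0.2 kWh for the 200W laptop PSU over 1 hour
```

This gives the ceiling you would pay for, not the actual bill; a power meter is still needed for the real number.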

What we are interested in is the Output, as the power supply takes the higher Alternating Current (AC) voltage from the electrical network and produces the lower Direct Current (DC) voltage that is used by the laptop. Since the laptop’s power supply has a wide-range input (100-240V) to make it compatible with the different voltages used worldwide, the input values are not going to do us much good. The output of the power supply, however, works just fine: if we take the 19.5 Volt output at 10.3 Ampere and multiply 19.5 * 10.3 we get 200.85 Watts – this is in fact our 200W power rating.
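The label math is just P = V * I; a minimal sketch in Python (function name is my own):

```python
# Power in Watts from the output voltage and current printed on a PSU label.
def power_from_label(volts, amps):
    return volts * amps

# The HP laptop PSU from the example: 19.5V at 10.3A.
print(round(power_from_label(19.5, 10.3), 2))  # 200.85 W, i.e. the "200W" rating
```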

We are using this specific HP laptop as an example because it has a built-in power meter that reports the real-time energy consumption of the computer based on the current load it is experiencing. You can see that although the power rating of the laptop’s power supply is 200W, in a normal usage scenario we are using just about 1/4 of that, or 53 Watts, according to the HP Power Assistant software that measures the current real-time power usage. So if that power usage of 53W remains the same for one hour of us using the laptop, we would have just 0.053 kWh of energy consumed, as opposed to the 0.2 kWh we calculated using only the power rating of the computer’s power supply. However, if we use a light bulb rated at 100W (a pretty bright one), we can accurately use the bulb’s 100W power rating to calculate its energy use over a period of time, as it maxes out its power rating whenever it is turned on and producing light. So things can vary between different electrical appliances, depending on how they actually work and what they do.

There is one more thing to consider with electronic devices that use a power supply to convert the higher voltage AC from the electricity network to the lower voltage DC energy that is needed by our computer to function. The conversion happens with varying efficiency, meaning that some extra energy is lost during the conversion in the form of heat, and as a result you may notice that the power supply of the laptop starts to get hotter after some usage. The higher the efficiency, the lower the loss of extra power in the form of heat; anything over 80% efficiency is generally considered good, and the most efficient solutions can go over 90%. So if we attach a power/energy meter between the power supply and the electricity network where we would normally connect it, we can measure the actual power usage. With the 53W reported by the laptop being used after the conversion, if the power supply has an efficiency of, let us say, 85%, the actual power drawn from the wall will end up being about 62.35W (53W divided by 0.85). That is precisely why we need to use a power meter to get a correct reading of the actual power being used by a device, and not rely only on the power rating on its label or some specifications.
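Since efficiency = output power / input power, the power drawn from the wall is the DC-side power divided by the efficiency. A minimal sketch (function name is my own):

```python
# Wall-side (AC input) power given the DC output power and converter efficiency.
# efficiency = output / input, therefore input = output / efficiency.
def wall_power_w(dc_power_w, efficiency):
    return dc_power_w / efficiency

# The 53W the laptop reports, through a PSU assumed to be 85% efficient:
print(round(wall_power_w(53, 0.85), 2))  # 62.35 W drawn from the socket
```

The difference between the 53W and the ~62W is the heat you feel coming off the power brick.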

### Energy and Power, What is the Difference?

20, Sep 2014

It seems that the two terms energy and power are often confusing for many people, and as a result there can be misunderstandings about the energy consumed over a given period of time, as well as how much energy is consumed by a device. Let us say that you have a device rated at 100W of power required for it to operate, for example a 100W light bulb that runs at this power level constantly in order to give light. But how much energy will the light bulb consume? That depends entirely on the time you have it turned on, turning energy into light. If you leave the light bulb running for 1 hour, it will have consumed 100 Wh of energy and you will need to pay for that energy used. But if you leave the light bulb turned on for just 30 minutes it will have used just 50 Wh, and for 2 hours the energy used will be 200 Wh.

Do you get the difference between energy and power now? The power rating of an item describes how much power the device needs in order to operate, but the energy used by it depends on the time you have the device operating. Since we normally pay per 1 kWh of energy used, you can easily do the math for the energy consumption of a device when you know its power rating:

Device power rating * Time in hours the device is operating = Energy used that you need to pay for

So if we turn back to the example with the 100W light bulb we can get:

100W * 0.5 hour = 0.05 kWh (or 50 Wh)
100W * 1 hour = 0.1 kWh (or 100 Wh)
100W * 2 hours = 0.2 kWh (or 200 Wh)
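The three worked examples above can be reproduced with a short loop (a minimal sketch; the function name is my own):

```python
# Energy used by a device: power rating (W) * time (h) / 1000 = energy in kWh.
def energy_kwh(power_w, hours):
    return power_w * hours / 1000

# The 100W light bulb over the three durations from the text.
for hours in (0.5, 1, 2):
    print(f"100W * {hours} hours = {energy_kwh(100, hours)} kWh")
```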

Now if we multiply the result we got for the energy used by the rate for 1 kWh of energy we get what we need to pay for:

0.2 kWh * \$0.15 USD = \$0.03 USD

So that little example in the real world would have cost us just 3 cents, at a rate of \$0.15 USD per kilowatt hour of energy used.
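The cost step is the same multiplication once more; sketched in Python (function name and the \$0.15 rate are the example's assumptions, not a real tariff):

```python
# Cost of the energy used: energy (kWh) * rate per kWh.
def energy_cost(kwh, rate_per_kwh):
    return kwh * rate_per_kwh

# 2 hours of the 100W bulb (0.2 kWh) at $0.15 per kWh:
print(round(energy_cost(0.2, 0.15), 2))  # 0.03, i.e. 3 cents
```

Swap in your own electricity rate from your utility bill to get a realistic figure.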
