
How do electric companies calculate the cost?

March 2, 2010 8:53:58 PM

So my question is: How do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is based on other factors.
I'm trying to figure out whether the power a CPU saves actually makes a difference when paying for electricity.

AMD's processors range from 95-125W, but Intel's range from 73-87W
March 2, 2010 9:06:03 PM

The electricity companies charge a rate per kilowatt-hour (kWh). Kilowatt-hours are a bit tricky to define, but imagine a 1 kW panel heater. If it runs for one hour it will consume 1 kWh of electricity. Obviously less powerful equipment (like your CPU) would need to run for longer to consume 1 kWh.

A 100 W TDP CPU would need to run 10 times longer than your panel heater to consume the equivalent 1 kWh.

Your electricity company then charges you as follows:

Price per kWh x number of kWh = total cost

Most companies have different prices for peak and off-peak times, but this is the gist of it.
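To put that formula into something concrete, here is a minimal Python sketch (the 1 kW heater and 100 W CPU figures are the ones above; the $0.12/kWh rate is just an assumed example, not anyone's actual tariff):

# Sketch of the billing formula: price per kWh x number of kWh = total cost.
# The 1 kW heater and 100 W CPU come from the explanation above; the rate
# is an assumed example, not a real tariff.

def energy_kwh(power_watts, hours):
    """Energy consumed, in kilowatt-hours."""
    return power_watts * hours / 1000.0

RATE_PER_KWH = 0.12  # assumed example rate, $/kWh

heater_kwh = energy_kwh(1000, 1)   # 1 kW panel heater for 1 hour -> 1.0 kWh
cpu_kwh = energy_kwh(100, 10)      # 100 W CPU for 10 hours       -> 1.0 kWh

print(heater_kwh * RATE_PER_KWH)   # 0.12 -> either one costs about 12 cents
print(cpu_kwh * RATE_PER_KWH)      # 0.12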

Sorry I haven't explained this very well; there are formulas to work out exactly what your equipment is using, but it's too late in the night to think of stuff like that :p !

I would imagine, though, that estimating it from the size of your PSU would be easier since it's less number-crunching. You just take the maximum power output of your PSU, and when you have worked it out it will give a worst-case answer. I guess it's always a bonus when you open your bills and they are lower than you expect ;)
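A quick sketch of that worst-case estimate (the 650 W rating, 4 hours/day and $0.12/kWh below are all assumed example figures):

# Worst-case estimate from the PSU's maximum rated output, as described above.
# The 650 W rating, 4 hours/day and $0.12/kWh are assumed example figures.
psu_max_watts = 650
hours_per_day = 4
rate_per_kwh = 0.12

monthly_kwh = psu_max_watts * hours_per_day * 30 / 1000.0             # 78 kWh
print(f"Worst-case monthly cost: ${monthly_kwh * rate_per_kwh:.2f}")  # ~$9.36
# The real bill will be lower, since a PC never draws the PSU's full rating.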



March 2, 2010 9:11:22 PM

Well, how much does it cost per kWh for you?
With a CPU that consumes less power, doesn't that keep the PSU further from full load and therefore save energy?

Example:
~10 W peripherals
95 W GPU
73 W vs 95 W CPU

650 W PSU

178 W vs 200 W total
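Totalling those example figures (TDP/nameplate numbers from the list above, not measured wall draw):

# Rough totals for the two configurations listed above.
peripherals = 10              # W, approximate
gpu = 95                      # W
cpu_low, cpu_high = 73, 95    # W, the two CPUs being compared

print(peripherals + gpu + cpu_low)                                     # 178 W
print(peripherals + gpu + cpu_high)                                    # 200 W
print((peripherals + gpu + cpu_high) - (peripherals + gpu + cpu_low))  # 22 W difference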

Also, is 1 joule used up in exactly 1 second?

Best solution

March 2, 2010 9:13:05 PM

tpho2500 said:
So my question is: How do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is based on other factors.
I'm trying to figure out whether the power a CPU saves actually makes a difference when paying for electricity.

AMD's processors range from 95-125W, but Intel's range from 73-87W


500 watts x 20 hours a week x 52 weeks per year x $0.10 per kWh / (1,000 watts per kW x 0.85 PSU efficiency x 12 months a year) = about $5.10 per month
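The same calculation broken out step by step (the numbers are the ones in the line above; the only assumption is that the 0.85 is PSU efficiency, i.e. wall draw = DC draw / 0.85):

# Step-by-step version of the one-line calculation above.
watts = 500            # assumed average system draw
hours_per_week = 20
weeks_per_year = 52
rate_per_kwh = 0.10    # $/kWh
psu_efficiency = 0.85  # wall draw = DC draw / efficiency

kwh_per_year = watts * hours_per_week * weeks_per_year / 1000.0   # 520 kWh
cost_per_year = kwh_per_year * rate_per_kwh / psu_efficiency      # ~$61.18
print(f"${cost_per_year / 12:.2f} per month")                     # $5.10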
March 2, 2010 9:23:57 PM

I'm not exactly sure what my charge per kWh is, but I know the rates are listed on your electricity bill, so maybe have a look there for yours :)

To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even when you're talking in terms of months or years. The best way to reduce your bills is to use components that require less power, which also lets you buy a smaller PSU. A higher-efficiency PSU also means less energy is 'wasted', helping to reduce your bills.
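To illustrate the efficiency point, a minimal sketch (the 300 W DC load, 4 hours/day and $0.12/kWh below are assumed figures, not measurements):

# How PSU efficiency changes what the electricity meter sees.
dc_load_watts = 300       # assumed load on the PSU's DC side
hours_per_year = 4 * 365  # assumed usage
rate_per_kwh = 0.12       # assumed rate

for efficiency in (0.80, 0.90):
    wall_watts = dc_load_watts / efficiency            # what the meter actually sees
    yearly_kwh = wall_watts * hours_per_year / 1000.0
    print(f"{efficiency:.0%} efficient PSU: ${yearly_kwh * rate_per_kwh:.2f} per year")
# ~$65.70 per year at 80% vs ~$58.40 at 90%, under these assumptions.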

And no, one joule isn't necessarily used up in exactly one second. They are two separate measurements: a joule is an amount of energy, while a watt is the rate at which energy is used (one joule per second). I believe electricity companies work in watts and hours, i.e. kilowatt-hours, as a simpler way to express energy consumption and hence cost.
March 2, 2010 11:33:32 PM

The largest difference in wattage saves you (125 - 73 =) 52 W. 1 kWh / 52 W = 19.2 hours of running time before you would use one extra kWh, or about a dime's worth of electricity (depending on your rate).
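The same arithmetic spelled out (the $0.10/kWh is an assumed "about a dime" rate):

# The biggest TDP gap from the question, converted into running time per kWh.
high_tdp = 125       # W, from the question
low_tdp = 73         # W, from the question
rate_per_kwh = 0.10  # assumed rate, i.e. "about a dime" per kWh

diff_watts = high_tdp - low_tdp        # 52 W
hours_per_kwh = 1000.0 / diff_watts    # ~19.2 hours of running per extra kWh
print(f"{hours_per_kwh:.1f} h per extra kWh, ~${rate_per_kwh:.2f} at the assumed rate")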

No wonder I am dumbfounded by people going nuts over green HDDs, where you might save 7 W total... LOL
March 12, 2010 11:04:22 PM

Best answer selected by tpho2500.
March 13, 2010 12:59:20 AM

moody89 said:
I'm not exactly sure what my charge per kWh is, but I know the rates are listed on your electricity bill, so maybe have a look there for yours :)

To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even when you're talking in terms of months or years. The best way to reduce your bills is to use components that require less power, which also lets you buy a smaller PSU. A higher-efficiency PSU also means less energy is 'wasted', helping to reduce your bills.

And no, one joule isn't necessarily used up in exactly one second. They are two separate measurements: a joule is an amount of energy, while a watt is the rate at which energy is used (one joule per second). I believe electricity companies work in watts and hours, i.e. kilowatt-hours, as a simpler way to express energy consumption and hence cost.



This post is flawed in a couple of ways.

1. A power supply is not constantly drawing its maximum rated power.

2. A higher-wattage power supply usually converts power most efficiently when it's delivering around 50% of its rated load; at the least, this is true of Corsair power supplies.

My suggestion to you, OP, is to get one of those socket power meters, plug your computer into it, and plug it into the wall. Look at your energy use over a week, multiply that by 4, and then multiply that by your electric company's rate per kWh.
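A rough sketch of that suggestion (the weekly reading and rate below are placeholders; a socket meter of this kind typically reports kWh directly):

# Turning a week's meter reading into a rough monthly cost, as suggested above.
kwh_measured_in_one_week = 3.5   # placeholder reading from the socket meter
rate_per_kwh = 0.12              # placeholder rate from the electricity bill

monthly_kwh = kwh_measured_in_one_week * 4                     # the "multiply by 4" step
print(f"Roughly ${monthly_kwh * rate_per_kwh:.2f} per month")  # ~$1.68 with these numbers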
March 13, 2010 1:14:11 AM

LOL, this is the kind of question you'd find on a junior high school exam.
August 16, 2011 1:32:35 PM

tpho2500 said:
So my question is: How do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is based on other factors.
I'm trying to figure out whether the power a CPU saves actually makes a difference when paying for electricity.

AMD's processors range from 95-125W, but Intel's range from 73-87W



Your example is not realistic at all, because of power-saving features and/or overclocking.

And if gaming makes up the majority of your computer time, your video card selection will have a much greater impact on your power usage.


August 16, 2011 2:22:46 PM

This topic has been closed by Mousemonkey