# How do electric companies calculate the cost?

So my question is: how do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is defined in terms of other units.
I'm trying to figure out whether the power a CPU saves actually makes a difference in what you pay for electricity.

AMD's processors range from 95-125 W, while Intel's range from 73-87 W.

The electricity companies charge a rate per kilowatt-hour (kWh). A kilowatt-hour is a bit tricky to define, but imagine a 1 kW panel heater: run it for one hour and it consumes 1 kWh of electricity. Obviously, less powerful equipment (like your CPU) would need to run for longer to consume 1 kWh.

A 100 W TDP CPU would need to run 10 times longer than the panel heater to consume the same 1 kWh.

Your electricity company then charges you as follows:

Price per kWh × number of kWh = total cost

Most companies have different prices for peak and off-peak times but this is the gist of it.
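That billing arithmetic can be sketched in a few lines (the $0.10/kWh rate below is a made-up example, not anyone's actual tariff):

```python
def electricity_cost(kwh_used, price_per_kwh):
    """Total cost = price per kWh x number of kWh."""
    return kwh_used * price_per_kwh

# A 1 kW panel heater running for one hour consumes 1 kWh.
heater_kwh = 1.0 * 1        # 1 kW x 1 hour
# A 100 W CPU needs 10 hours to consume the same 1 kWh.
cpu_kwh = 0.100 * 10        # 0.1 kW x 10 hours

# At a hypothetical $0.10 per kWh, both cost the same:
print(electricity_cost(heater_kwh, 0.10))
print(electricity_cost(cpu_kwh, 0.10))
```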

Sorry I haven't explained this very well; there are formulas to work out exactly what your equipment is using, but it's too late in the night to think of stuff like that!

I would imagine, though, that estimating from the size of your PSU would be easier since it's less number-crunching. You just take the maximum power output of your PSU, and the figure you work out will be a worst-case answer. I guess it's always a bonus when you open your bills and they are lower than you expected.

Well, how much does it cost per kWh for you?
With a CPU that consumes less power, the PSU never gets taken to full load, so doesn't that save energy?

Example:
~10 W peripherals
95 W GPU
73 W vs. 95 W CPU

650 W PSU

178 W vs. 200 W total

Also, is 1 joule immediately used in 1 second?
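The 178 W vs. 200 W comparison above can be costed out directly; the rate and daily usage below are assumed placeholders, not figures from the thread:

```python
RATE = 0.10              # $ per kWh (assumed)
HOURS_PER_MONTH = 8 * 30 # 8 hours a day (assumed)

def monthly_cost(watts):
    """Convert a steady draw in watts to a monthly dollar cost."""
    kwh = watts / 1000 * HOURS_PER_MONTH
    return kwh * RATE

low = monthly_cost(178)   # build with the 73 W CPU
high = monthly_cost(200)  # build with the 95 W CPU
print(round(high - low, 2))  # the monthly savings from the lower-TDP CPU
```

Even under these generous assumptions the 22 W gap amounts to well under a dollar a month.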

Best solution


tpho2500 said:
So my question is: how do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is defined in terms of other units.
I'm trying to figure out whether the power a CPU saves actually makes a difference in what you pay for electricity.

AMD's processors range from 95-125 W, while Intel's range from 73-87 W.

500 watts x 20 hours a week x 52 weeks per year x \$0.10 per kWh / (1000 watts per kW x 0.85 efficiency x 12 months per year) = \$5.09 per month
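That formula can be reproduced verbatim; every value below comes from the line above:

```python
watts = 500
hours_per_week = 20
weeks_per_year = 52
price_per_kwh = 0.10
psu_efficiency = 0.85   # losses in the PSU inflate wall-socket draw
months_per_year = 12

monthly = (watts * hours_per_week * weeks_per_year * price_per_kwh
           / (1000 * psu_efficiency * months_per_year))
print(f"${monthly:.2f} per month")  # ~$5.10 (the post truncates to $5.09)
```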

I'm not exactly sure what my charge per kWh is, but I know rates are listed on your electricity bill, so have a look there for yours.

To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even over months or years. The best way to reduce your bills would be to use components that draw less power, which also lets you buy a smaller PSU. A higher-efficiency PSU would also mean less energy is 'wasted', helping to reduce your bills.

And no, I don't think one joule is necessarily used per second; they are two separate measurements. I believe electricity companies use joules per second (i.e. watts) as a simpler way to calculate energy consumption and hence cost.
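To make the watt/joule distinction concrete: a watt is a rate (one joule per second), so energy only accumulates as time passes. A quick sketch:

```python
# 1 W = 1 J/s, so energy in joules = power in watts x time in seconds.
def joules(watts, seconds):
    return watts * seconds

print(joules(1, 1))        # a 1 W device uses 1 J in one second
print(joules(100, 3600))   # a 100 W CPU uses 360,000 J in an hour...
print(joules(100, 3600) / 3.6e6)  # ...which is 0.1 kWh (1 kWh = 3.6 MJ)
```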

The largest difference in wattage saves you (125 − 73) 52 W. 1 kWh / 52 W ≈ 19.2 hours of running time before you would save 1 kWh, or about a dime's worth of electricity (depending on your rate).
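Checking that arithmetic directly (the TDP figures are the ones from this thread):

```python
delta_w = 125 - 73            # worst-case TDP gap between the two CPUs
hours_per_kwh = 1000 / delta_w  # hours of runtime to save one kWh
print(round(hours_per_kwh, 1))  # ~19.2 hours per kWh saved
```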

No wonder I'm dumbfounded by people going nuts over 'green' HDDs, where you might save 7 W total... LOL

Best answer selected by tpho2500.

moody89 said:
I'm not exactly sure what my charge per kWh is, but I know rates are listed on your electricity bill, so have a look there for yours.

To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even over months or years. The best way to reduce your bills would be to use components that draw less power, which also lets you buy a smaller PSU. A higher-efficiency PSU would also mean less energy is 'wasted', helping to reduce your bills.

And no, I don't think one joule is necessarily used per second; they are two separate measurements. I believe electricity companies use joules per second (i.e. watts) as a simpler way to calculate energy consumption and hence cost.

This post is flawed in a couple of ways.

1. A power supply is not constantly pulling its maximum rated power.

2. A higher-wattage power supply usually runs most efficiently when it's delivering around 50% of its rated load; at the very least, this is true of Corsair power supplies.

My suggestion to you, OP, is to get one of those socket power meters: plug your computer into it, and it into a socket. Read your energy use in kWh over a week, multiply that by 4 for a rough month, and then multiply by your electric company's rate per kWh.
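The meter approach boils down to three multiplications; the reading and rate below are hypothetical placeholders:

```python
weekly_kwh = 12.5   # hypothetical reading from a plug-in power meter
rate = 0.11         # hypothetical $/kWh, taken from your bill

monthly_kwh = weekly_kwh * 4      # the post's rough weeks-to-month factor
monthly_bill = monthly_kwh * rate
print(round(monthly_bill, 2))     # estimated monthly cost for the PC
```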

LOL, a question straight off a junior high school exam.

tpho2500 said:
So my question is: how do electric companies calculate the cost when they bill people for electricity?

I understand that 1 watt = 1 joule/second, but a joule is defined in terms of other units.
I'm trying to figure out whether the power a CPU saves actually makes a difference in what you pay for electricity.

AMD's processors range from 95-125 W, while Intel's range from 73-87 W.

Your example is not realistic at all due to power-saving features and/or overclocking.

And if you are gaming the majority of your computer time, your video card selection will have a much greater impact on your power usage for the most part.


This topic has been closed by Mousemonkey