How do electric companies calculate the cost?
Anonymous
So my question is: how do electric companies calculate the cost when they bill people for electricity?
I understand that 1 watt = 1 joule/second, but a joule is itself defined in terms of other units.
I'm trying to figure out if the amount of power of a cpu saved actually makes a difference when paying for electricity.
AMD's processors range from 95–125 W, but Intel's range from 73–87 W.

The electricity company charges a rate per kilowatt-hour (kWh). Kilowatt-hours are a bit tricky to define, but imagine a 1 kW panel heater. Run for one hour, it consumes 1 kWh of electricity. Obviously, less powerful equipment (like your CPU) would need to run for longer to consume 1 kWh.
A CPU with a 100 W TDP would need to run 10 times longer than that panel heater to consume the same 1 kWh.
Your electricity company then charges you as follows:
Price per kWh x number of kWh = total cost
Most companies have different prices for peak and off-peak times, but this is the gist of it.
Sorry I haven't explained this very well; there are formulas to work out exactly what your equipment is using, but it's too late in the night to think of stuff like that :P!
I would imagine, though, that calculating it from the size of your PSU would be easier, since it's less number-crunching: you just take the maximum power output of your PSU, and working it out that way gives a worst-case answer. I guess it's always a bonus when you open your bills and they are lower than you expected.
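The billing formula above (energy in kWh = power in kW × hours, cost = energy × rate) can be sketched in a few lines of Python. The $0.10/kWh rate is a made-up example figure, not a real tariff:

```python
# Sketch of the forum answer's billing formula.
# energy (kWh) = power (kW) x time (h); cost = energy x rate per kWh.

def energy_kwh(power_watts: float, hours: float) -> float:
    """Convert a device's power draw and runtime into kilowatt-hours."""
    return power_watts / 1000.0 * hours

def bill(power_watts: float, hours: float, rate_per_kwh: float) -> float:
    """Price per kWh x number of kWh = total cost."""
    return energy_kwh(power_watts, hours) * rate_per_kwh

# A 1 kW panel heater run for 1 hour consumes 1 kWh;
# a 100 W CPU needs 10 hours to consume the same 1 kWh.
print(energy_kwh(1000, 1))   # 1.0
print(energy_kwh(100, 10))   # 1.0
print(bill(100, 10, 0.10))   # 0.1  (assumed $0.10/kWh rate)
```

This mirrors the answer's point: the heater and the CPU cost the same per kWh; the CPU just takes 10× longer to rack one up.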
Best answer
Anonymous said: So my question is: how do electric companies calculate the cost when they bill people for electricity?
I understand that 1 watt = 1 joule/second, but a joule is itself defined in terms of other units.
I'm trying to figure out if the amount of power a CPU saves actually makes a difference when paying for electricity.
AMD's processors range from 95–125 W, but Intel's range from 73–87 W.
500 W x 20 hours/week x 52 weeks/year x $0.10 per kWh ÷ (1000 W per kW x 0.85 efficiency x 12 months/year) ≈ $5.10 per month
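The best answer's arithmetic, redone step by step with its own figures (500 W draw, 20 h/week, $0.10/kWh, 85% PSU efficiency):

```python
# Step-by-step version of the best answer's worked example.
watts = 500
hours_per_week = 20
weeks_per_year = 52
rate_per_kwh = 0.10
psu_efficiency = 0.85   # wall draw = delivered power / efficiency
months_per_year = 12

# Annual energy delivered to the components, in kWh.
kwh_per_year = watts * hours_per_week * weeks_per_year / 1000  # 520 kWh

# Dividing by efficiency accounts for energy the PSU wastes as heat.
cost_per_month = kwh_per_year * rate_per_kwh / psu_efficiency / months_per_year
print(round(cost_per_month, 2))  # 5.1
```

The exact value is about $5.098/month, which the original post truncated to $5.09.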
I'm not exactly sure what my charge per kWh is, but I know rates are listed on your electricity bills, so maybe have a look there for yours.
To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even when you're talking in terms of months or years. The best way to reduce your bills would be to use components that require less power, which also lets you buy a smaller PSU. Also, a higher-efficiency PSU means less energy is 'wasted', helping to reduce your bills.
And no, a joule and a watt aren't the same thing; a watt is a rate (one joule per second). I believe electricity companies quote power in watts simply because it makes energy consumption, and hence cost, easier to calculate.
moody89 said: I'm not exactly sure what my charge per kWh is, but I know rates are listed on your electricity bills, so maybe have a look there for yours.
To be honest, I doubt very much that the TDP of a CPU makes a significant difference, even when you're talking in terms of months or years. The best way to reduce your bills would be to use components that require less power, which also lets you buy a smaller PSU. Also, a higher-efficiency PSU means less energy is 'wasted', helping to reduce your bills.
And no, a joule and a watt aren't the same thing; a watt is a rate (one joule per second). I believe electricity companies quote power in watts simply because it makes energy consumption, and hence cost, easier to calculate.
This post is flawed in a couple of ways.
1. A power supply is not constantly drawing its maximum rated power.
2. A higher-wattage power supply usually runs most efficiently when it's delivering around 50% of its rated load; at the least this is true of Corsair power supplies.
My suggestion to you, OP, is to get one of those socket power meters: plug your computer into it, and it into a socket. Look at your energy use over a week (in kWh), multiply this by 4 for a month, and then multiply that number by your electric company's rate per kWh.
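The meter suggestion above boils down to one multiplication. A minimal sketch, where the 3.5 kWh weekly reading and $0.12/kWh rate are made-up example values:

```python
# Sketch of the socket-power-meter suggestion: scale one measured week
# of usage to a month, then price it at the per-kWh rate.
# Both input values below are hypothetical examples.

def monthly_cost(week_kwh: float, rate_per_kwh: float) -> float:
    """Approximate a month as 4 weeks of the measured weekly usage."""
    return week_kwh * 4 * rate_per_kwh

print(monthly_cost(3.5, 0.12))
```

Measuring at the wall like this captures the PSU's real draw (including its inefficiency), which is why it beats estimating from the PSU's rated wattage.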
Anonymous said: So my question is: how do electric companies calculate the cost when they bill people for electricity?
I understand that 1 watt = 1 joule/second, but a joule is itself defined in terms of other units.
I'm trying to figure out if the amount of power a CPU saves actually makes a difference when paying for electricity.
AMD's processors range from 95–125 W, but Intel's range from 73–87 W.
Your example is not realistic at all because of power-saving features and/or overclocking.
And if gaming makes up the majority of your computer time, your video card selection will have a much greater impact on your power usage than your CPU will.