I seem to get scoffed at when I mention how important this is. So I thought I'd do some math and get some actual figures on paper (electronic paper, anyway) to show that my points have validity.
Before I get into the real math and figures, I'll say this: over a year's time, the differences in cost are substantial. We argue endlessly over the initial cost of CPUs and GPUs relative to their performance, but when power efficiency comes into play we seem to forget operational cost per performance. If the video card you just bought for $30 less performs on par with another but is significantly less efficient in watt usage and heat generation, you'll spend that $30 in savings over the next 6 months, and more in the months after. Wouldn't that be something you'd consider at purchase time? Don't think energy costs that much? I dare you to keep reading.
Based on an Anandtech article on energy costs from late 2008, I worked out the math for flat-rate kWh consumption (http://www.anandtech.com/show/2668/2). Keep in mind that running a computer not only draws power but also generates heat, which increases the demand on your A/C. I won't explore that aspect of the cost in this post, but I will share my own energy usage and costs for the last 4 months, which give a good real-world example of what extra heat and energy usage do to an actual bill.
I live in Georgia, USA, for those curious about rates. The climate here is usually hot and humid (temperatures easily reach 100+ with 90% humidity in the summer months), which puts a heavy demand on any A/C unit. Pricing here is tiered by the kWh you use and by season. For example, in winter the first 650 kWh is charged at $0.045991/kWh, the next 350 kWh at $0.039460/kWh, and anything beyond 1,000 kWh at $0.038737/kWh. In the summer months the rates increase as you reach each tier, up to $0.078765/kWh (http://www.georgiapower.com/pricing/pdf/2.10_R-16.pdf). Below are the costs exactly as they come out of my pocket. I considered stripping out the fees and taxes, but decided it wouldn't matter: every energy company charges them, and fees and taxes on the services provided are inherent to doing business.
June 2010 - 1,438 kWh - $176.48
July 2010 - 1,004 kWh - $126.58
August 2010 - 834 kWh - $102.70
September 2010 - 722 kWh - $80.44
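The tiered winter rates quoted above can be turned into a quick calculator. This is a sketch of the energy charge only; it ignores the base charge, fuel riders, fees, and taxes, so it will not match the bill totals listed here (which are at summer rates and include all fees):

```python
# Georgia Power winter energy-charge tiers, as quoted above.
WINTER_TIERS = [
    (650, 0.045991),           # first 650 kWh
    (350, 0.039460),           # next 350 kWh
    (float("inf"), 0.038737),  # everything beyond 1,000 kWh
]

def winter_energy_charge(kwh):
    """Return the winter energy charge in dollars for kwh of usage."""
    total = 0.0
    remaining = kwh
    for tier_kwh, rate in WINTER_TIERS:
        used = min(remaining, tier_kwh)
        total += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return total

# September's 722 kWh, if it had been billed at winter rates
print(f"${winter_energy_charge(722):.2f}")
```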
Note that all four bills are at the summer charge rates. The only variables here are the heat outside and the fact that I got my brother to turn off his computer when he wasn't using it, starting in early-to-mid August (which only shows up in September's bill). I'm certain next month's bill will be even lower, since it's cooler and my brother is no longer living with me. I always turn my computer off when I'm not using it; it's on for about 8 hours a day, 4-6 hours at full load (gaming) and 2-4 hours at or near idle.
Due to the way energy conversion works, a 100% efficient PSU can't possibly exist, and roughly 82% is the industry standard. For a sense of availability, look at what Newegg carries: out of 250 listed PSUs, 189 are rated below 80 Plus, at 80 Plus, or Bronze certified (82%+); Silver certification requires 85%+. Most custom builders probably use 500-800W PSUs, and if they're building for initial cost effectiveness, I doubt the majority will spend more than $75 on one. That said, only a handful of PSUs at or under $75 are Bronze certified, let alone 80 Plus certified. So if you spent even less on an inefficient PSU to save on initial cost, you can basically take all the numbers I provide here and multiply them by the efficiency difference. Combine that with living somewhere energy costs are doubled (such as CA), and it makes a HUGE difference.
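To see what efficiency means in watts: wall draw is the DC load divided by the PSU's efficiency, and everything above the DC load is shed as heat. A quick illustration (the 400 W load is just a hypothetical figure):

```python
def wall_draw(dc_load_w, efficiency):
    """Watts pulled from the wall to deliver dc_load_w to the components."""
    return dc_load_w / efficiency

load = 400  # hypothetical DC load in watts
for eff in (0.75, 0.82, 0.85):  # cheap unit, 80 Plus Bronze, 80 Plus Silver
    draw = wall_draw(load, eff)
    print(f"{eff:.0%}: {draw:.0f} W from the wall, {draw - load:.0f} W lost as heat")
```

The cheap 75% unit pulls roughly 45 extra watts from the wall for the same 400 W load, and all of it ends up as heat your A/C has to fight.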
Now on to the figures based on Anandtech's article. I based my initial figures on the $0.07/kWh charge rate from NC (which happens to be pretty close to what GA charges as well). To be fair to PSU efficiency versus actual wall draw, the watt figure for a standard 82%-efficient PSU is provided in parentheses, and the final costs are based on those amounts. You can scale the final amounts by the ratio of your energy cost ($0.07/kWh = 100%) or by your PSU's distance from Bronze certification (82% = 100%; scaled inversely).
Example #1 - Inefficient gaming rig
-at idle, will consume 310 watts (378)
-under full load, will consume 550 watts (671)
#1-This computer ran for 10 hours a day (8 hours full load, 2 hours at idle): $167.65/year; $13.97/month
#2-This computer ran for 24 hours a day (8 hours full load, 16 hours at idle): $312.57/year; $26.05/month
Example #2 - Efficient gaming rig
-at idle, will consume 160 watts (195)
-under full load, will consume 350 watts (427)
#3-This computer ran for 10 hours a day (8 hours full load, 2 hours at idle): $104.19/year; $8.68/month
#4-This computer ran for 24 hours a day (8 hours full load, 16 hours at idle): $178.93/year; $14.91/month
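The per-scenario math reduces to wall watts x hours x days x rate. Here's a sketch of that method; the dollar figures in the scenarios above come from Anandtech's own tables and may embed extra assumptions or rounding, so this won't reproduce them to the penny, but it shows the shape of the calculation:

```python
def annual_cost(load_w, idle_w, load_hours, idle_hours,
                rate_per_kwh=0.07, psu_efficiency=0.82):
    """Yearly electricity cost for one machine, given DC draw and daily usage."""
    wall_load = load_w / psu_efficiency   # watts pulled from the wall at load
    wall_idle = idle_w / psu_efficiency   # watts pulled from the wall at idle
    kwh_per_day = (wall_load * load_hours + wall_idle * idle_hours) / 1000
    return kwh_per_day * 365 * rate_per_kwh

# Inefficient rig, 10 h/day (like scenario #1): 550 W load 8 h, 310 W idle 2 h
print(f"${annual_cost(550, 310, 8, 2):.2f}/year")
# Efficient rig, 24 h/day (like scenario #4): 350 W load 8 h, 160 W idle 16 h
print(f"${annual_cost(350, 160, 8, 16):.2f}/year")
```

Doubling `rate_per_kwh` doubles the cost, and dropping `psu_efficiency` scales everything up proportionally, which is exactly the adjustment described above.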
In both examples, you can see that simply leaving your computer on increases costs dramatically. These costs do not include your monitor, only the case and what's inside.
As an extreme Example #3, here are the same four scenarios at California costs with a less efficient PSU (75% instead of 82%):
#1-$366.48/year; $30.54/month
#2-$683.28/year; $56.93/month
#3-$227.76/year; $18.99/month
#4-$391.14/year; $32.59/month
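These California figures are just the base scenarios rescaled: multiply by the rate ratio ($0.14 vs. $0.07, i.e. doubled) and by the efficiency ratio (0.82 / 0.75). A sketch, which lands within a few cents of the numbers above (rounding accounts for the rest):

```python
def rescale(base_cost, new_rate=0.14, base_rate=0.07,
            new_eff=0.75, base_eff=0.82):
    """Rescale a yearly cost to a different kWh rate and PSU efficiency."""
    return base_cost * (new_rate / base_rate) * (base_eff / new_eff)

# Rescale the four base scenarios to CA rates and a 75%-efficient PSU
for cost in (167.65, 312.57, 104.19, 178.93):
    print(f"${rescale(cost):.2f}/year")
```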
So what's the actual difference between an extremely inefficient computer run without simple power-saving habits and an efficient one run with them? Humongous: a difference of $34/month, and that still excludes the additional heat generated. Operational costs can easily differ by $40/month between extremes in California, and $20/month between extremes in the southeast (NC and GA at least). I'll let you figure out the exact amounts, but over the 2-4 years you own that component of your computer, you could have saved enough in operational costs to upgrade again, or to purchase a higher quality product in a third to an eighth of the time.
Fortunately, for our wallets and for Mother Earth alike, both AMD and Nvidia seem to have embraced the importance of energy efficiency. The new 6800 series seems only to improve on the 5800 series' efficiency, and the GTX 460 is a huge improvement over the 470 and 480. As dies shrink, CPUs pack more onto the chip at lower frequencies, using fewer watts and producing less heat overall. Things are looking much better. Ultimately, though, the choice is still yours.
Enlightened, you shall be.