How important is the efficiency of a power supply?

Traciatim

Distinguished
Here is some actual math. Say for instance your machine is going to use 300 watts while playing a game and you game for 4 hours every day.

An 80+ Bronze supply at 85% efficiency will draw about 352 watts from the wall, with about 52 watts wasted as heat.
An 80+ Gold supply at 90% efficiency will draw about 333 watts from the wall, with about 33 watts wasted as heat.

Over 4 hours that translates to 208 watt hours and 132 watt hours, or 0.208 kWh and 0.132 kWh. If you pay $0.15 per kWh, that's about 3.12 cents per day versus 1.98 cents per day of waste. Over 4 years of the supply's life that works out to about $45.55 versus $28.90, a difference of $16.65.

So if the difference in cost is $16.65 or less, the Gold supply makes sense financially. It also dumps less heat into your room, is easier to cool, and Gold supplies are generally built with higher-quality components, so it isn't a purely financial decision. I just wanted to show how much difference there actually is.
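That arithmetic can be sketched as a short script. All the figures (300 W load, 4 hours per day, $0.15/kWh, 4-year lifespan) are just the example numbers from the post; the unrounded results come out slightly higher than the rounded 52 W / 33 W figures above:

```python
# Sketch of the PSU efficiency cost comparison, using the
# example numbers from the post (not universal constants).

def yearly_waste_cost(load_w, efficiency, hours_per_day, rate_per_kwh):
    """Yearly cost of the power a PSU wastes as heat."""
    wall_w = load_w / efficiency           # power drawn from the wall
    waste_w = wall_w - load_w              # lost as heat in the PSU
    kwh_per_day = waste_w * hours_per_day / 1000
    return kwh_per_day * rate_per_kwh * 365

bronze = yearly_waste_cost(300, 0.85, 4, 0.15)  # 80+ Bronze
gold = yearly_waste_cost(300, 0.90, 4, 0.15)    # 80+ Gold

print(f"Bronze waste over 4 years: ${bronze * 4:.2f}")
print(f"Gold waste over 4 years:   ${gold * 4:.2f}")
print(f"Difference:                ${(bronze - gold) * 4:.2f}")
```

Swapping in your own load, hours, and electricity rate tells you the most you should pay extra for the more efficient supply on financial grounds alone.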
 

onichikun

Distinguished
Nov 13, 2009
304
1
18,860


Well, look at it like this: if you have ~10% greater efficiency on an 800W PSU, that saves you about 80W of power that would otherwise be thrown away (in reality the saving is smaller, since efficiency fluctuates with load and you probably won't run consistently at full load).

If you have your computer on all day (not asleep) you save about 80 × 24 ≈ 1,920 Wh per day, call it ~2 kWh, which is about $0.10 per day where I live. That extra ~$69 USD would be paid off in about 2 years :p

So not really that important.
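The payback estimate above can be sketched the same way. The 80 W saving and 24/7 uptime are the upper-bound figures from the post, the ~$0.05/kWh rate is implied by the quoted $0.10/day, and the $69 price premium is the post's example figure:

```python
# Payback time for a more efficient PSU (upper-bound case from the
# post: 80 W saved, machine on 24/7, ~$0.05/kWh, $69 price premium).

def payback_days(saved_w, hours_per_day, rate_per_kwh, price_premium):
    """Days until the efficiency saving covers the extra purchase cost."""
    kwh_per_day = saved_w * hours_per_day / 1000
    cost_saved_per_day = kwh_per_day * rate_per_kwh
    return price_premium / cost_saved_per_day

days = payback_days(80, 24, 0.05, 69)
print(f"Payback in roughly {days / 365:.1f} years")
```

Note this is the best case for the efficient supply; at partial load or fewer hours per day, the payback period stretches out considerably.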
 

Traciatim



That math only works if you have an 800 watt supply and it's delivering that much to the machine 24/7, which is highly unlikely. Very few machines short of a coin miner would ever run like that... and in those cases it generally does make sense to go for the most efficient supplies possible, since wasted power is wasted cash.


 

onichikun





I used the extreme case to give an upper bound. Unless you are running server-class hardware 24/7 with redundant PSUs on three-phase power, you aren't going to save much from 10% more efficiency.

My lab prefers "greener" hardware, not only to boast better power numbers but also because it substantially lowers our power bill (about $1 million per year).