Should I get an 80+ Bronze?

gonemtbiking123

Distinguished
Jul 17, 2010
88
0
18,640
So this weekend I am going to finish up a spur-of-the-moment home server build. Running a Phenom 9600 I had lying around, just got an AM2+ motherboard, 2 GB of RAM, 1 hard drive, and a case/PSU from an old HP computer (2004-ish). I am wondering if I should upgrade the PSU to an 80+ Bronze, maybe http://www.newegg.com/Product/Product.aspx?Item=N82E16817371033 , because the server is going to be on 24/7.

I know that the PSU from the HP is bad, but I don't know how bad. I am wondering if the 80+ Bronze one will pay for itself after a little while. Thanks for the input.
 

ulillillia

Distinguished
Jul 10, 2011
551
0
19,010
If you intend on 24/7 usage, I would recommend Gold. Although Gold costs more up front, the efficiency savings add up with round-the-clock use.

Bronze appears to require 82% efficiency from 20 to 100% load, Silver 85%, and Gold 88%. If your components are drawing, for example, 300 watts, an 82% Bronze unit would pull 300/0.82, or about 365.9 watts, from the grid. An 88% Gold would pull 300/0.88, or about 340.9 watts. The bulk of the extra is lost as heat, which is bad for computers. The less wasted power there is, the less heat is generated.

It's not a huge difference, but given those numbers, for every 14 hours the 80 Plus Bronze certified power supply runs, the Gold saves about one hour's worth of its electricity. 14 days of 24/7 usage means your 15th day is effectively free by comparison.

From there, weigh the electricity savings against the extra cost of the more efficient power supply. That 25-watt difference saves 0.025 kWh every hour; at 10 cents per kilowatt-hour that's a quarter of a cent per hour, so if the more efficient supply costs $30 more than the Bronze, you'd break even at about 12,000 hours (roughly 500 days) of 24/7 use. Just something to consider.
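The savings-and-break-even arithmetic above is easy to sandbox. A rough Python sketch, using the example figures from this thread (the load, efficiencies, rate, and $30 premium are just placeholders; plug in your own numbers):

```python
# Rough break-even estimate for a more efficient PSU.
load_watts = 300          # DC power the components draw
eff_bronze = 0.82         # 80+ Bronze efficiency (example figure)
eff_gold = 0.88           # 80+ Gold efficiency (example figure)
rate_per_kwh = 0.10       # electricity cost in $/kWh
price_premium = 30.0      # extra cost of the Gold unit in $

grid_bronze = load_watts / eff_bronze   # ~365.9 W from the wall
grid_gold = load_watts / eff_gold       # ~340.9 W from the wall
saved_watts = grid_bronze - grid_gold   # ~25 W less drawn (and less heat)

savings_per_hour = saved_watts / 1000 * rate_per_kwh   # $/h saved
breakeven_hours = price_premium / savings_per_hour
print(f"Saves {saved_watts:.1f} W; break-even after {breakeven_hours:.0f} h "
      f"({breakeven_hours / 24:.0f} days of 24/7 use)")
```

Swap in your measured wattage and local electricity rate; the break-even point is very sensitive to both.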

Also, efficiency is best around 40 to 60% of the power supply's maximum rating. That is, if your hardware requires 300 watts, a power supply in the 550 to 700 watt range keeps the load near that sweet spot.
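That sizing rule of thumb can be sketched the same way. A minimal Python helper, assuming the 40-60% loading band described above (`psu_range_for` is a made-up name for illustration):

```python
def psu_range_for(load_watts, low=0.40, high=0.60):
    """Return (min, max) PSU wattage so the load sits at 40-60% of capacity."""
    return load_watts / high, load_watts / low

lo, hi = psu_range_for(300)
print(f"For a 300 W load, look at units of roughly {lo:.0f}-{hi:.0f} W")
```

With the full 40-60% band this comes out to about 500-750 W; the 550-700 W figure above is a narrower pick inside that band (it puts a 300 W load at 43-55% of capacity).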
 
Solution

gonemtbiking123

Distinguished
Jul 17, 2010
88
0
18,640


Good point. I think I will hold off on the PSU until I install the new CPU and motherboard this weekend and see how much power I am pulling at idle and load. I don't imagine it will be anywhere near 300 watts: no DVD drive, no USB devices, no mouse/keyboard, no GPU/monitor. From there I will see what I am dealing with.

To give me an idea of how much I would save, does anyone have a guess at how efficient an old PSU out of an HP would be, 65-70%?
 

ulillillia

Distinguished
Jul 10, 2011
551
0
19,010
A server should only need a few low-power components, so it may only use 100 watts, meaning a 200-watt power supply would be about right. I'm only guessing, though; servers are a complete unknown to me as far as hardware, operating systems, and software go, beyond the obvious basics like what disk space is and such. Someone who knows servers well would be the one to ask. Still, at least you have the basic idea of the math behind the power usage, so you can plan your budget accordingly.

I see some power supplies giving as poor as 70% efficiency, meaning that 300 watts demanded by the system becomes 428.6 watts off the grid, and likely a lot of heat to go with it. I wouldn't know how a 10-year-old power supply fares efficiency-wise. The only way to find out is to search online or, if you have the equipment and scientific know-how, run your own measurement.
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010


+1
 

Yep, I agree with that rating of the old PSU.
Plenty of Bronze PSUs operate at Silver-level efficiency at lower loads, by the way.
 


Not really; a good PSU will give ca. 80% efficiency from 20% to 100% load. FYI, 80+ Gold, for example, requires the PSU to have 87%+ efficiency at full load. So no, that wouldn't be optimal, in terms of cost or of efficiency. You want a PSU that's loaded to about 20% even at idle, which means you shouldn't pick a PSU that's too big.
 

^+1, I agree!