NaShOrN

Distinguished
Dec 26, 2006
31
0
18,530
I've got my PC up and running, but I'm about to switch power supplies (currently running a 500W one that is just barely doing the job). As I can't afford to spend much money, I'm looking for the most efficient PSU I can find. I found a nifty PSU that draws just a tad more than 1600W to output 900W, which means around 55% efficiency (the best I could find; most are around 35-40%). Since this one is only a little more expensive than the 700W version, which also does 55%, I'm wondering whether a PSU's efficiency increases the further it is from its max output (e.g., putting out 400W, would it draw 600W instead of the 727W it would at 55%?). So, does PSU efficiency decrease as it delivers more power, or is it the same no matter how much power you're drawing from it?
 

NaShOrN

Distinguished
Dec 26, 2006
31
0
18,530
Well, a PSU that works at 115V and needs 8A draws 920W. If it outputs 500W while drawing 920W, that means roughly 55% efficiency, right?
 

cb62fcni

Distinguished
Jul 15, 2006
921
0
18,980
It's not that simple, actually. A 500W PSU that's outputting 500W is going to be less efficient than if it were outputting 400W. E.g., you may need 675W at the wall to output 500W, whereas to output 400W you only need 475W. (I pulled these numbers from my rear - demonstration purposes only!) Temperature is also a variable: the higher temps rise, the lower your efficiency gets. Most PSUs from reputable manufacturers get 75-85% efficiency below 40C at 80-90% of their max output.
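
To put numbers on that: efficiency is just DC watts out divided by AC watts in, so a quick Python sketch using those same made-up figures (illustration only, not measurements of any real unit) looks like this:

# efficiency = DC watts out / AC watts in (made-up numbers from above)
def efficiency(output_w, input_w):
    return output_w / input_w

print(efficiency(500, 675))  # ~0.74, i.e. about 74% at a full 500W load
print(efficiency(400, 475))  # ~0.84, i.e. about 84% at a 400W load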
 

NaShOrN

Distinguished
Dec 26, 2006
31
0
18,530
Thanks a lot for the help, guys. I know what really matters is whether the PSU is reliable, but I'm not up for paying huge electrical bills because my PSU converts most of the energy drawn from the wall into heat. I'm not very fond of wasting money.
 
Be aware that a PSU's listed efficiency rating is the peak value, not a constant value. When I was researching PSUs back in 2005, most premium-quality PSUs were at their most efficient when the load placed on the PSU was between 70% and 90%. They are at their most inefficient when the load placed on the PSU is low. Therefore, it is possible for a 1000w PSU to be less efficient than a 500w PSU if all the components draw only 350w.

Based on some extensive research, I decided to buy the Seasonic S12 500 PSU, which is rated at 80%+ efficiency. Based on tests done at SilentPCReview.com, its efficiency ranges from 74% at low power consumption to 79% at maximum load. The Seasonic S12 500 hits 83% efficiency when the load placed on it is somewhere between a 70% load and a 95% load.
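
To make the sizing point concrete, here is a rough Python sketch with assumed efficiency figures (the 82%/74% numbers are purely hypothetical, not measurements of any particular unit): a 350w draw puts a 500w PSU at 70% load, near its sweet spot, but a 1000w PSU at only 35% load, where efficiency is usually worse.

# hypothetical efficiencies at a 350W component load - not measured values
load_w = 350
eff_500w_unit = 0.82    # assumed: 70% of capacity, near the efficiency sweet spot
eff_1000w_unit = 0.74   # assumed: 35% of capacity, well below the sweet spot
print(round(load_w / eff_500w_unit))    # ~427W pulled from the wall
print(round(load_w / eff_1000w_unit))   # ~473W pulled from the wall

The difference between those two wall figures is pure heat, which is exactly the waste you are worried about.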

If you are concerned about how much power your PC uses, then look at your components. The CPU and GPU are the two most power-hungry components in your system, and a powerful GPU will draw more watts than a powerful CPU.

Generally speaking, Intel's Core 2 Duo CPUs use less power than comparable AMD Athlon 64 X2 (AM2) CPUs. Regarding GPUs, nVidia's video cards use less power than ATI video cards. The exceptions are the 8800GTS and GTX; they are the most power-hungry cards so far.

Below are links to Xbitlabs.com articles about CPU and GPU power consumption:

Contemporary Dual-Core Desktop Processors Shootout

AMD’s Response to Intel Conroe: Energy Efficient Athlon 64 X2 CPU

Faster, Quieter, Lower: Power Consumption and Noise Level of Contemporary Graphics Cards

Directly Unified: Nvidia GeForce 8800 Architecture Review
 
You can also keep items that are not in use unplugged. Most electronics now pull electricity even when they are off. For some, they end up using more electricity over the hours they sit idle than they do during the short time they are actually being used.

I believe I read somewhere that an electronic device (like a stereo or TV) still uses around 40% as much electricity in standby mode as it does when it is on.

For example, say your TV uses 150w of power when it is on. If you watch TV for an hour, the TV will have used 150Wh of energy.

For the remaining 23 hours the TV is in standby mode, where it uses 40% of the power it does when on. That works out to 60w (150w x 0.4). Over those 23 hours of standby it will consume another 1,380Wh (60w x 23 hours).
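
Working that example out in energy terms (same assumed figures: 150w on, 40% in standby, 1 hour of viewing a day):

# assumed figures from the example above
on_w = 150
standby_w = 150 * 0.4                   # 60W while in standby (assumed 40%)
daily_wh = 1 * on_w + 23 * standby_w    # 150 + 1,380 = 1,530 Wh per day
print(daily_wh / 1000)                  # ~1.5 kWh per day
print(daily_wh * 30 / 1000)             # ~46 kWh per month for one TV

So on those assumptions (if that 40% figure is anywhere near right), the standby hours, not the viewing hour, account for nearly all of the TV's energy use.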
 

sailer

Splendid
You can also keep items that are not in use unplugged. Most electronics now pull electricity even when they are off. For some, they end up using more electricity over the hours they sit idle than they do during the short time they are actually being used.

I believe I read somewhere that an electronic device (like a stereo or TV) still uses around 40% as much electricity in standby mode as it does when it is on.

For example, say your TV uses 150w of power when it is on. If you watch TV for an hour, the TV will have used 150Wh of energy.

For the remaining 23 hours the TV is in standby mode, where it uses 40% of the power it does when on. That works out to 60w (150w x 0.4). Over those 23 hours of standby it will consume another 1,380Wh (60w x 23 hours).

This is exactly the reason that I put my TV, stereo, VCR, etc. on multi-plug surge protectors, so I can hit one switch and turn them all completely on or off and there's no wasted electricity during the hours I don't use them. Don't know what that does to their life expectancy, but the TV's about 16 years old anyway and I've been thinking of eventually replacing it.
 

cb62fcni

Distinguished
Jul 15, 2006
921
0
18,980
Yea, I've heard the same thing, but I've never seen any conclusive proof in the form of testing. Someone really should go around with an ammeter and measure some home appliances. It's criminal, really, for something that's turned off to continue to consume power. It's just sloppy design on the part of the manufacturer, or should I say sloppy engineering on the part of whoever designed the AC/DC converters.
 

cb62fcni

Distinguished
Jul 15, 2006
921
0
18,980
If anything, I would expect cutting the power completely to prolong their life rather than shorten it. Maybe my logic is off, but if a TV is consuming 60W in standby mode, then that power is going somewhere. Heat, electron migration, etc. As long as the input lines and converters are decently designed, applying power to them isn't going to "shock" them.
 

sailer

Splendid
It's not really criminal. People wanted instant-on TVs, radios, and the like, so the electronics industry accommodated them. The easiest way to get instant-on is to never completely power down the appliance.
 

cb62fcni

Distinguished
Jul 15, 2006
921
0
18,980
I don't know about that. While I certainly concede that it's part of the problem, the electronics firms should be more forthright about power consumption in standby mode.
 

dragonsprayer

Splendid
Jan 3, 2007
3,809
0
22,780
Well, a PSU that works at 115V and needs 8A draws 920W. If it outputs 500W while drawing 920W, that means roughly 55% efficiency, right?

No, that's wrong - you're probably adding up all the totals, but they overlap. Efficiencies aren't calculated like that at all.
 

sailer

Splendid
Keep in mind that the idea of "instant on" started about 40 years ago. Power was cheap, TVs still used tubes, and the available options for making something turn on instantly were very few. Nobody cared if it burned a few more watts of electricity as long as the TV turned on when they wanted it to. Even today, relatively few people care about the electricity used.

That may change with time. If the cost of power goes high enough, people will pay more attention to the labels that tell how much power a device uses and what the estimated cost per year is to run the thing. The information is usually there for people who care to look for it, but how many bother to look?
 

waylander

Distinguished
Nov 23, 2004
1,649
0
19,790
Well, a PSU that works at 115V and needs 8A draws 920W. If it outputs 500W while drawing 920W, that means roughly 55% efficiency, right?

You have to realize that the PSU is NOT constantly drawing the full 115V x 8A; that rating is just the maximum the input is specified for. The PSU converts the AC from the wall into 12v, 5v and 3.3v DC, and that conversion is where the efficiency comes into play. That is why your efficiency numbers are off. Take any given PSU - let's make one up - and say the 3.3v line supplies 22 amps, the 5v line supplies 20 amps and the 12v lines (combined) supply 30 amps. That adds up to about 532 watts of DC capacity. If you want it to run at its most efficient, you put roughly a 75% load on it (somewhere between 70 and 90%), which is about 399 watts, and if the PSU is 80% efficient at that point you are actually pulling about 499 watts from the wall.
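
Laying that math out in a quick Python sketch (the rail figures are the made-up label from above, and the 80% efficiency is assumed):

# made-up PSU label from the example above
rails_w = 3.3 * 22 + 5 * 20 + 12 * 30   # 72.6 + 100 + 360 = 532.6W of DC capacity
load_w = rails_w * 0.75                  # ~399W, inside the 70-90% sweet spot
wall_w = load_w / 0.80                   # ~499W drawn from the wall at 80% efficiency
print(rails_w, load_w, wall_w)

The 8A @ 115V on the label is just the input rating; what the unit actually pulls from the wall is the DC load divided by the efficiency at that load.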

Does that make sense?