Solved

Can someone explain total power consumption for me?

November 9, 2013 6:58:36 AM

I was looking at graphics cards the other day and came across one that topped out at 360 watts. The supplier recommended a 750 watt PSU, but the reviews clearly stated that was overkill. I've been told that you want at least a 12 V 42 A rating on a card like that, which is what I have (mine is 600 watts with 46 A on the 12 V rail).

How do you guys calculate that? I'd like to start doing it on my own. I have no idea how much a CPU, RAM, HDD, and DVD/Blu Ray drive use by themselves.

Best solution

November 9, 2013 8:07:36 AM

Amperage is not what you should be looking at, or at least not what you should be most concerned about. To summarize: your +12V rail is the most important rail, and it usually powers around 90-95% of the computer. This is why people look at the amperage on this rail, but an important thing to note is that V(olts) * A(mps) = W(attage). Most people go by either the "amps are more important" method or the 50%-load method. Both are acceptable, but neither looks at the full picture. The best way is to add up the wattage of all of your components (everything, fans included) and then purchase a power supply that can carry most of that load on its +12V rail.

So if you added up all your components and the total wattage at full load was, say, 530 W, then you would want a PSU that can put about 530 W (probably a bit more, just to be safe) on its +12V rail. So you have a 600 W PSU with a 46 A +12V rail. As we know from earlier:
W = A * V
We have the amperage and the voltage of the +12V rail, so we plug those into the equation:
W = 46 * 12
W = 552. This is the maximum wattage that can run through your +12V rail.

So your PSU in that example would be fine and it would have some headroom.
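The arithmetic above is simple enough to script. Here is a minimal sketch of the check (the 600 W / 46 A and 530 W figures are the examples from this thread; the function names are my own):

```python
# W = A * V: the maximum wattage a rail can deliver.
def rail_wattage(amps, volts=12.0):
    return amps * volts

# Does the +12V rail cover the summed component load (plus an optional
# safety margin)?
def psu_ok(component_load_w, rail_amps, margin_w=0.0):
    return rail_wattage(rail_amps) >= component_load_w + margin_w

print(rail_wattage(46))  # 552.0 -- max wattage through a 46 A +12V rail
print(psu_ok(530, 46))   # True -- 552 W of capacity covers a 530 W load
```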

Conclusion: Your PSU will most likely be able to handle that GPU without any problems. In the future, try adding up the potential max wattage and working through the calculations to see whether your PSU can handle it. If I didn't explain this clearly enough, please ask me any questions and I'll do my best to answer them.
November 9, 2013 8:10:14 AM

I generally look at the CPU and the GPU. You can use their TDP (thermal design power), i.e. how much heat they dissipate; since all the power a component consumes is eventually dissipated as heat, this tells you roughly how much power it draws. Blu-ray/DVD drives, RAM, and HDDs all use less than about 10 W each, and your motherboard could pull up to 50 W.
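As a sketch of that estimate, you can just sum per-component figures. The CPU and GPU TDPs below are illustrative assumptions, not numbers for any specific parts; the small-component figures follow the rough bounds above:

```python
# Rough full-load estimate by summing per-component draw (watts).
estimated_draw_w = {
    "cpu_tdp": 95,       # assumed CPU TDP, from its spec sheet
    "gpu_tdp": 250,      # assumed GPU TDP, from its spec sheet
    "motherboard": 50,   # upper bound mentioned above
    "ram": 10,           # <= ~10 W each per the post
    "hdd": 10,
    "optical": 10,
}
total_w = sum(estimated_draw_w.values())
print(total_w)  # 425
```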

The reason manufacturers say you should have a wayyyy bigger PSU than you actually need is that places still sell POS units like this:
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
4/5 eggs? Not bad. Rated at 650 W, it must be good enough! Nope: 38 A combined on the 12V rail, and that's only 456 W on the rail that provides >80% of the power in modern computers. The problem is that most people don't know what to look for on the PSU label, so manufacturers build a big enough fudge factor into their recommendations that if you buy the recommended wattage, you are unlikely to end up with a unit crappy enough to fail to power the card.


One important thing to note when reviews measure total power consumption: they are almost always measuring at the wall, which means PSU efficiency is a factor. Most test systems now use 80+ Gold or Platinum rated units with efficiency close to 90%, so this is less of an issue than it used to be when test systems had ~80% efficient PSUs. But when a review says total power consumption was 500 W, that actually means only about 450 W was being drawn from the PSU; in older reviews, or reviews with less efficient PSUs, it would have meant only about 400 W, so power at the wall gave a significantly inflated estimate.
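A quick sketch of that at-the-wall correction (efficiency figures are the approximate ones from this post, and `dc_power` is my own name for the calculation):

```python
# At-the-wall readings include the PSU's own conversion losses, so the
# DC power actually delivered to the components is roughly
# wall power * efficiency.
def dc_power(wall_watts, efficiency):
    return wall_watts * efficiency

print(dc_power(500, 0.90))  # about 450 W from a ~90% efficient Gold/Platinum unit
print(dc_power(500, 0.80))  # about 400 W from an older ~80% efficient unit
```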
November 9, 2013 8:21:34 AM

Hi,

The graphics card manufacturer always recommends a power supply with sufficient wattage for the whole computer, which can often be more than necessary.

On the other hand, it is best practice to have a power supply that can deliver more than the total amount of power the computer draws under full load.

For example, if the total power of the computer under full load, especially with the graphics card, is around 360 W, then a power supply of minimum 450 W, but better still 500-650 W, is preferable.
That is because the power supply is more efficient if it works at 50-70% of its capacity, rather than at the very edge of its capacity.

Another benefit is that there will still be enough power for future upgrades of the computer.
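The 50-70% rule of thumb above can be turned into a sizing sketch; the 360 W figure is the example load from this thread, and the function name is my own:

```python
# Pick a PSU whose capacity puts your full-load draw in the 50-70% band:
# the minimum size puts the load at 70% of capacity, the maximum at 50%.
def psu_size_band(full_load_w):
    return full_load_w / 0.70, full_load_w / 0.50

lo, hi = psu_size_band(360)
print(round(lo), round(hi))  # 514 720 -- roughly the 500-650 W advice above
```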

Now, please remember that for power-hungry graphics cards, it is always the amperage (A) on the 12V rail of the power supply that matters.

Here is a good calculator you can use, just to give you an idea of your computer's power consumption, though it will not be perfectly accurate:
eXtreme Power Supply Calculator

Quote:
The recommended total Power Supply Wattage gives you a general idea on what to look for BUT it is NOT a crucial factor in power supply selection! Total Amperage Available on the +12V Rail(s) is the most important, followed by the +5V amperage and then the +3.3V amperage.


Best Regards from Sweden
November 9, 2013 8:32:30 AM

Flyfisherman said:

For example, if the total power of the computer under full load, especially with the graphics card, is around 360 W, then a power supply of minimum 450 W, but better still 500-650 W, is preferable.
That is because the power supply is more efficient if it works at 50-70% of its capacity, rather than at the very edge of its capacity.


The 50% load idea is a myth. In reality, you'll get a higher average efficiency (very small, but it's there) from a lower-capacity PSU, and remember that no one runs their computer at 100% load 100% of the time. Headroom is important, but not double the wattage or anywhere near it; 50-100 W is a good target. As for future builds: have you noticed that while computers take more power than they used to, most GPUs stay in the 200-300 W range across generations? Each new generation does not bring greater power needs with it.
November 9, 2013 10:54:58 AM

So many great answers. Thanks guys, I understand it a lot better now.