Is this PSU good for this build?


Fixadent

I was told that it's always a good idea to have a PSU with a capacity that is a "few hundred watts" above the maximum expected power draw of your system for the stability and lifespan of the unit.

So I've always followed that advice.
 


Solution

Your power draw should be around 450-500W under load when gaming. Adding 300W to that isn't close to 1000W; it's more like 750W, like ko888 said. Besides, I don't believe in all that "add 300W" stuff. I think it's mostly baloney and doesn't hold up when you look at things objectively.
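To put rough numbers on that, here's a throwaway sketch of the arithmetic; the 450-500W draw is just the estimate above, not measured data:

```python
# Throwaway sketch of the arithmetic above: the "add 300 W" rule applied
# to the estimated 450-500 W gaming draw. Nothing here is measured data.

for gaming_load_w in (450, 500):
    suggested_w = gaming_load_w + 300
    print(f"{gaming_load_w} W draw + 300 W headroom = {suggested_w} W")
# 450 W draw + 300 W headroom = 750 W  (the ~750 W figure mentioned above)
# 500 W draw + 300 W headroom = 800 W
```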

Fixadent


Might this be a better option? And is it also a tier one unit?

https://www.newegg.com/Product/Product.aspx?Item=n82e16817139044&nm_mc=KNC-GoogleKWLess&cm_mmc=KNC-GoogleKWLess-_-DSAFeed-All-Products-_-powersupplies-_-NA&ignorebbr&gclid=CJWuzpSNkdUCFYFqfgod0g4GJQ
 

Sedivy

Actually, I agree with that several-hundred-watt overhead. If in the future you want to add a newer video card (say, the Vega or Volta cards due out in 2017) or a CPU upgrade, you'd save yourself the expense of purchasing a new PSU. Not to mention that effective wattage does drop over time, so having some overhead from the start is not a terrible thing.
I'm not saying you can't get an 800W unit and be fine, just that I like planning my rigs for future upgrades in advance, as it often saves costs that way.
 
No, the AX860 is old now, and that price is way too high. If you're getting a 750W unit, the Corsair RM750x is a better PSU for nearly half the price: https://www.newegg.com/Product/Product.aspx?Item=N82E16817139142&cm_re=corsair_rm750x-_-17-139-142-_-Product

You could even get a 550W one (how much power you need depends, of course, on how much you overclock), but 750W works too.



People have said that, but in my research I've never seen anything to confirm or validate it, or to explain why, or even what it means, since PSUs don't simply run out of power; they shut off under excessive load via a protection circuit. While performance may decrease over time, performance is not the same thing as effective wattage. Frankly, "effective wattage" isn't even a measured quantity.
 

Sedivy

A PSU's performance is measured by its efficiency, and its efficiency does reflect its wattage, since it shows effective power versus power wasted as heat. Efficiency matters right from the start when purchasing, and it does tend to go down over time. Is this somehow a contested issue?
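To make the "effective power vs. power wasted as heat" relationship concrete, here's a minimal sketch; the 87% figure is just a typical 80 Plus Gold number around mid-load, an assumption rather than a measurement:

```python
# Minimal sketch of efficiency as "effective power vs. power wasted as heat".
# 87% is a typical 80 Plus Gold efficiency around mid-load, assumed here
# purely for illustration.

dc_output_w = 400                      # power actually delivered to the components
efficiency = 0.87                      # assumed efficiency at this load point

ac_input_w = dc_output_w / efficiency  # power drawn from the wall
heat_w = ac_input_w - dc_output_w      # the difference is dissipated as heat

print(f"Wall draw: {ac_input_w:.0f} W, waste heat: {heat_w:.0f} W")
# Wall draw: 460 W, waste heat: 60 W
```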
 


Can you provide proof that a PSU's efficiency decreases as it ages?
 

Sedivy

Like... a scientific paper? Lol, no, and I don't feel inclined to chase one down either. Here's an excerpt from a wiki on PSUs (which is a shitty source, but I really don't feel like hunting down references):
Total power requirements for a personal computer may range from 250 W to more than 1000 W for a high-performance computer with multiple graphics cards. Personal computers usually require 300 to 500 W.[10] Power supplies are designed around 40% greater than the calculated system power consumption. This protects against system performance degradation, and against power supply overloading. Power supplies label their total power output, and label how this is determined by the amperage limits for each of the voltages supplied. Some power supplies have no-overload protection.
The way I read it, they're designed with power overhead because performance degradation is to be expected. Can you show me otherwise? I first heard about this in my undergrad classes, and if I'm totally wrong about it, I'd like to know.
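Just to spell out what that "around 40% greater" guideline works out to numerically (a sketch of the excerpt's claim, using a hypothetical draw figure, not an endorsement of the rule):

```python
# Sketch of the wiki excerpt's "around 40% greater" design guideline.
# The 350 W calculated draw is a hypothetical figure for illustration.

calculated_draw_w = 350
suggested_rating_w = calculated_draw_w * 1.4   # 40% above calculated consumption

print(f"{calculated_draw_w} W calculated -> ~{suggested_rating_w:.0f} W rating")
# 350 W calculated -> ~490 W rating, i.e. roughly a 500 W unit
```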
 


Performance is measured through tests involving transient response measurements, voltage regulation (including crossloading), ripple checking, etc. Efficiency is efficiency.

Additionally, efficiency does not reflect wattage, since power supplies are rated by power output. Besides, "wattage" isn't standardized anyway: ask different people what a power supply's wattage means and you'll get dozens of different answers.
 

Sedivy

Wattage is a unit of power? How else do you even define power?
How does efficiency not reflect the wattage? Efficiency shows the percentage of power not lost as heat. It's directly related?
If they design a system with 40% overhead in power to compensate for degradation, how do you read that?
And I'm not saying performance won't degrade in other ways. I just don't see why you're excluding efficiency in particular from this degradation.
 


They don't. Ask jonnyGURU, who's in charge of PSUs at Corsair. Their warranties are for 24/7, 100% load. Take the Corsair RM550x with its 10-year warranty: that's for 550W of load (DC, not AC) constantly for 10 years. That's how they rate their warranties; the engineers believe it'll last that long under those conditions.

And if you want overhead, it's there. That's why the Seasonic Prime Titanium 650W can do 925W of continuous load. Of course, I'm not recommending anybody do that, but you can.

If, as you think, you're not supposed to use a 500W power supply at anything over 350W, why would they call it a 500W PSU instead of just calling it a 350W one? They don't, because this "overhead" thing is just stuff people say. At the end of the day, power supplies are tested and reviewed by professionals, and they're shown to perform fine at 80% load, 90% load, 100% load. So why are people so fearful of running a PSU past, say, 60% load?

As with most things, there's actually no guarantee a higher-wattage unit will last longer. PSUs are complex; different ones have different fan profiles and components. It's not so clear-cut and easy to tell.

BTW, check this out; it's relevant: http://www.jonnyguru.com/forums/showthread.php?t=13750 - it's literally everything we're discussing, and it's actually a thread I started myself in search of an answer about power supply longevity.
 

Sedivy

Ah, I was not saying you can't use a 500W unit at anything over 350W. I was under the impression that 500W was meant to cover the max wattage of a PSU, not what it would be providing during most of its use. I did not know that... does 500W mean the max/peak wattage is actually higher? That first link, about overcurrent protection... but that's a safety level, no? As in, the level the current reaches before it just fries the unit?
I don't think running a PSU past 60% load is bad, but the higher the load, the higher the heat, and the higher the heat, the more the efficiency drops. So if you do hours of gaming on your machine every day, I thought running it at 80-90% of its rated load is actually less efficient and generates more heat, so you'd be better off with a bit more headroom (sketched below)? Or... do I have that wrong? This says what I mean better (though again, not a reliable source): https://hardforum.com/threads/on-psu-efficiency.1575419/
I was under the impression that, like most electrical components, a PSU's capacity for power conversion drops in efficiency with time and constant heat generation, i.e. if it started off with an 80% rating, it won't be there 10 years down the road, and as a result the same system will actually draw more power from the outlet, which at some point, without overhead, may exceed the safe max wattage of the PSU?
I thought this had to do with transformer performance over time under constant heat exposure, not with capacitors? Capacitors, in my mind, have to do with DC vs. AC conversion. Maybe I'm getting this wrong; I won't pretend this is my specialty.
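Here's a rough sketch of that headroom argument: the same DC load served at different points on an assumed efficiency curve. The efficiency values are invented for illustration; real curves would come from load-tester reviews:

```python
# Sketch of the headroom argument above: the same DC load, served at
# different points on an assumed efficiency curve. The efficiency values
# are invented for illustration; real curves come from load-tester reviews.

def waste_heat_w(dc_load_w: float, efficiency: float) -> float:
    """Heat the PSU dissipates for a given DC load and efficiency."""
    return dc_load_w / efficiency - dc_load_w

dc_load_w = 450

# Hypothetical efficiencies: ~90% of rating (smaller unit) vs. ~50% (bigger unit).
heat_small_unit = waste_heat_w(dc_load_w, efficiency=0.85)
heat_big_unit = waste_heat_w(dc_load_w, efficiency=0.89)

print(f"450 W load at ~90% of rating: ~{heat_small_unit:.0f} W of heat")  # ~79 W
print(f"450 W load at ~50% of rating: ~{heat_big_unit:.0f} W of heat")    # ~56 W
```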
 

Sedivy

Ah, I found a link to the study mentioned in the thread you started:
https://www.hardocp.com/article/2015/03/09/silverstone_olympia_1000w_power_supply_7_year_redux/2
It does show that at high load, after 7 years, it's unable to complete a full-load test, and that the temps are higher overall, but that at lower loads the wattage is about the same and doesn't drop appreciably... which is interesting. Not an overall drop in the wattage output, but more a reduction in the maximum it's able to output.
 
I have seen that Olympia review where they check back after 7 years. I understand that it shuts off in Test 4, whereas it didn't 7 years before. But what they fail to reveal in that review is in what manner it shut down, or why. It seems it did not explode or anything like that, which means the only possible explanation is that one of the protection circuits shut it down.

Overcurrent protection/overpower protection can be ruled out: those shut down the power supply at a certain load, and 7 years' difference would have no effect on that. So it had to have been something like undervoltage protection that shut down the power supply. It's hard to say, since the author of that review didn't seem interested in expanding beyond "it didn't finish this test". I want to know why.

Efficiency, as I predicted, was down about 0.5-1%. No big deal there.

Yes, you're right that running the PSU at a higher load will generally make it less efficient and therefore hotter. The question that's often difficult to answer is whether paying more money for a higher-wattage unit will end up saving you money in the long run, and whether it will last longer. The former can be calculated quite easily based on how often you plan to use the computer (a sketch follows below); the latter is not easy to say.
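For what it's worth, here's what that calculation looks like; every number in it (hours, load, efficiencies, the $0.12/kWh rate) is an illustrative assumption, not data from any review:

```python
# Sketch of the "can be calculated quite easily" part: the extra electricity
# cost of a less efficient unit over time. All figures below (hours, load,
# efficiencies, electricity rate) are illustrative assumptions.

dc_load_w = 400          # assumed average draw while the machine is under load
hours_per_day = 4        # assumed daily hours at that load
price_per_kwh = 0.12     # assumed electricity rate, $/kWh
years = 5

def annual_cost(efficiency: float) -> float:
    """Yearly electricity cost to deliver dc_load_w at a given efficiency."""
    ac_input_w = dc_load_w / efficiency
    kwh_per_year = ac_input_w * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

cost_less_efficient = annual_cost(0.85)   # hypothetical unit run near its limit
cost_more_efficient = annual_cost(0.90)   # hypothetical unit with headroom

extra = (cost_less_efficient - cost_more_efficient) * years
print(f"Extra cost over {years} years: ${extra:.2f}")   # roughly $23
```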

Even if a higher-wattage unit does last longer, the lower-wattage one may still last plenty long. So is it worth getting the higher-wattage one based on the conjecture that it'll last significantly longer? The real question is which is better for the money, which has better value.

The way I think of it: if you're paying 20% more for a higher-wattage unit, you'd better expect it to last 20% longer. Unfortunately, I don't believe that would happen in the real world; probably both would last a long time. High-quality PSUs should last a long time regardless of their wattage or load. But you do bring up good points. I personally wouldn't recommend a 500W PSU if someone's system load were exactly 500W, just on principle. And I would like to see more units tested the way that Olympia was.
 


It's not entirely clear what you mean by this. Wattage output is whatever the reviewer sets the load tester to: the reviewer configures it to 700W and, boom, the PSU is supplying 700W. I really don't understand what you mean by "at lower loads, the wattage is about the same" - the definition of a low load is low wattage.



Not fries it, but shuts itself down so it doesn't end up getting fried.
 

Sedivy

Even if a higher-wattage unit does last longer, the lower-wattage one may still last plenty long. So is it worth getting the higher-wattage one based on the conjecture that it'll last significantly longer? The real question is which is better for the money, which has better value.
Oh, I wouldn't get a higher one for that reason. Quality of the components, in my mind, is what makes a unit last longer. I get overhead because I expect that at some point I'll upgrade the GPU, the CPU, or both, and I don't want to have to think about an entirely new PSU when I can just get slightly higher power at the start, saving money in the long run. Though I did think that efficiency would drop over time as well, which apparently is not the case, at least not to a great extent.
So it had to have been something like undervoltage protection that shut down the power supply. It's hard to say, since the author of that review didn't seem interested in expanding beyond "it didn't finish this test".
Hmm... this is a good point. I assumed that, all things being equal between tests, when they loaded it so that 1007W was required/drawn, it couldn't supply that and shut down, and therefore that the max power it can supply dropped over time.
This is what I meant by load. Yes, they show that at 700W the efficiency drop was tiny (less than a percent), but that it wasn't able to supply 1000W anymore. Did I... maybe I said that wrong? Why undervoltage protection? I was looking at the 120V scenario. Are you talking about a drop in voltage on the individual rails of the PSU? They do mention some parameters of how they define a fail on a test (a 5% vs. 10% drop from the nominal voltage) here: https://www.hardocp.com/article/2007/02/25/hard_look_at_power_supplies/4. Sorry, I didn't entirely understand that.
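To illustrate what that 5% vs. 10% deviation criterion means in practice, here's a small sketch; the nominal rail values are the standard ATX ones, and the sample readings are made up:

```python
# Sketch of the pass/fail idea behind the "5% vs. 10% drop from nominal"
# criteria mentioned above. Nominal rail voltages are the standard ATX
# values; the thresholds mirror the review's stated parameters, and the
# sample readings are invented.

NOMINAL_V = {"12V": 12.0, "5V": 5.0, "3.3V": 3.3}

def rail_status(rail: str, measured_v: float) -> str:
    """Judge a rail reading by its deviation from the nominal voltage."""
    deviation = abs(measured_v - NOMINAL_V[rail]) / NOMINAL_V[rail]
    if deviation <= 0.05:
        return "pass"       # within the tighter 5% band
    if deviation <= 0.10:
        return "marginal"   # outside 5% but within 10%
    return "fail"

print(rail_status("12V", 11.50))   # ~4.2% low -> pass
print(rail_status("12V", 11.00))   # ~8.3% low -> marginal
```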

I thought their conclusion was that while efficiency was good at lower wattage (so a lower load from the system?), at higher loads (usage? watts? power draw?) it wasn't able to supply that power anymore. Because their system never came anywhere close to drawing 1000W, they never noticed, but the tests showed it. Or that's what I thought this meant, anyway:
The main issue we find here in the load testing is that this unit is no longer able to do full power at 45C or even at 22C when it was retested. Certainly, that is a change worth noting and more significant than the tiny voltage regulation change numbers. We would like to interject an important note here though, this unit has been nothing but rock solid over the last 7 years when running so, clearly, the systems it was powering were not drawing near the total rated capacity and as such I have never noticed this until putting it back on the load tester
But yes, it's hard to draw a solid conclusion from one test of one power supply.