Ah, I wasn't saying you can't use a 500W unit at anything over 350W. I was under the impression that 500W was meant to cover the max wattage of a PSU, not what it would be providing during most of its use. I didn't know that... so a 500W rating means the actual max/peak wattage is higher? And that first link about over-current protection... isn't that a safety threshold? As in, the level beyond which the current just fries it?
I don't think running a PSU past 60% load is bad in itself, but the higher the load, the higher the heat, and the higher the heat, the more efficiency drops. So if you do hours of gaming on your machine every day, I thought running it at 80-90% of its rated load is actually less efficient and generates more heat, so you'd be better off with a bit more headroom? Or do I have that wrong? This (though again not a reliable source) says better what I mean: https://hardforum.com/threads/on-psu-efficiency.1575419/
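Rough numbers for what I mean, with made-up but plausible efficiency figures (illustrative only, not from any spec sheet):

```python
# Sketch: waste heat inside a hypothetical 500W PSU at two load points.
# The efficiency values below are assumptions for illustration, not measurements.

def waste_heat_w(dc_load_w: float, efficiency: float) -> float:
    """Heat dissipated inside the PSU: wall draw minus DC output."""
    wall_draw_w = dc_load_w / efficiency
    return wall_draw_w - dc_load_w

# ~50% load, assumed 90% efficient
print(round(waste_heat_w(250, 0.90), 1))  # -> 27.8 (watts of heat)
# ~90% load, assumed 85% efficient
print(round(waste_heat_w(450, 0.85), 1))  # -> 79.4 (watts of heat)
```

So even a few points of efficiency drop at high load roughly triples the heat the unit has to shed, which is the headroom argument as I understand it.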
I was under the impression that, like most electrical components, a PSU's power-conversion efficiency degrades over time with constant heat exposure, i.e. if it started out at an 80% rating, it won't still be there 10 years down the road. As a result, the same system will actually draw more effective power from the outlet, which at some point, without headroom, may exceed the safe max wattage of the PSU?
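To put numbers on that worry (the efficiency figures are hypothetical, just to show the arithmetic): the same fixed DC load pulls more from the wall as efficiency decays.

```python
# Sketch: wall draw for a fixed 350W DC load as conversion efficiency degrades.
# Both efficiency values are assumed for illustration, not measured from a real unit.

def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the outlet to deliver dc_load_w of DC output."""
    return dc_load_w / efficiency

print(round(wall_draw_w(350, 0.85), 1))  # -> 411.8 (when new, assumed 85% efficient)
print(round(wall_draw_w(350, 0.78), 1))  # -> 448.7 (aged, assumed 78% efficient)
```

(As I understand it, the wattage rating is on the DC output side, so the extra draw shows up as increased wall power and internal heat rather than a higher DC number, but happy to be corrected on that.)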
I thought this had to do with transformer performance degrading over time under constant heat exposure, not with capacitors? Capacitors, in my mind, have to do with AC-to-DC conversion. Maybe I'm getting this wrong; I won't pretend this is my specialty.