Ferrariassassin :
I understand that if it has more, you pay more, but I just never could understand why a power supply's cost goes up when it has a higher wattage, when almost every household puts out amps, which is much more than mere watts and volts. It confuses me because the way I see it, they use fewer parts in a power supply that has a higher wattage because, for example, a 200W power supply will need many more parts to block and shield away the extra power coming in from the wall outlet, and a 1,000W power supply would need fewer parts to block and shield the extra power from the wall outlet since it lets out more power, thus making it cost less to make. I know I am probably completely wrong about this in every way; could someone please explain how this works?
Hi there OP,
This is a great set of questions. I'll try and answer them in an easily understood fashion.
The power that is delivered to houses is in the form of a periodic sinusoid. In North America this sinusoid swings between approximately +170 volts and -170 volts relative to ground, 60 times per second. This kind of power is perfectly fine for the applications it was originally designed for, such as heating elements, lights, and electric motors, but it is not suitable for electronics. There are several reasons why it is unsuitable:
1. The voltage is way too high. Voltage is a measure of the amount of energy that a charge carrier absorbs or dissipates as it passes between two points. A single electron that travels from a 170 volt potential to a 0 volt potential will dissipate 170 electron volts worth of energy. A coulomb of charges passing through the same potential will dissipate 170 joules of energy. This is far, far too much energy for electronics to work with, so it must be reduced to something more reasonable, usually between 1 and 12 volts.
2. The voltage isn't steady. Most electronics are designed around steady operation; they aren't designed to have their supply voltage constantly changing, and many are not tolerant of having the polarity of the supply voltage reversed. AC power (the sinusoidal delivery mechanism) is extremely well suited to power transmission, but it is not suitable for electronics. AC power must be converted to steady DC power at nominally fixed voltage levels.
3. The voltage isn't highly predictable. Even if the supply oscillates predictably between two levels 60 times per second, there's often a large amount of noise riding on top of the signal. This noise adds a random offset to the amplitude of the signal. Noise is unavoidable, as it is partly a consequence of the environment, but it can be reduced.
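As a quick numeric check on the figures above, here's a minimal Python sketch. It shows where the ~±170 volt swing comes from (120 V is the RMS value of the sinusoid) and works out the energy-per-charge figures; the 1.602e-19 C elementary charge is a standard physical constant.

```python
import math

# North American mains: 120 V is the RMS value of the sinusoid.
# The peak is RMS * sqrt(2), which is where the ~±170 V swing comes from.
rms = 120.0
peak = rms * math.sqrt(2)
print(round(peak, 1))              # 169.7

# Energy dissipated by charge falling through a potential: E = q * V.
# One coulomb through 170 V dissipates 170 joules;
# a single electron through the same potential dissipates 170 eV.
volts = 170.0
joules_per_coulomb = 1.0 * volts   # q = 1 C
print(joules_per_coulomb)          # 170.0

electron_charge = 1.602e-19        # coulombs (elementary charge)
joules_per_electron = electron_charge * volts
```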
So, the task of a computer power supply is as follows:
1. Rectify noisy high amplitude AC power input into steady DC power output
2. Provide stable DC power output that is minimally variant with the attached load
3. Suppress noise and AC ripple on the DC output as much as possible
4. Provide safety features to protect the user and prevent damage to the attached components in the event of a failure
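To make task #1 concrete, here's a toy Python illustration (not a circuit model) of the first step, full-wave rectification: flipping the negative half of the AC sinusoid positive. The result has a single polarity but is still far from steady, which is why the filtering and regulation tasks follow.

```python
import math

# ~one 60 Hz cycle of the 170 V peak mains sinusoid, sampled every 1 ms.
peak, freq = 170.0, 60.0
samples = [peak * math.sin(2 * math.pi * freq * t / 1000.0)
           for t in range(17)]

# An ideal full-wave rectifier outputs the absolute value of its input.
rectified = [abs(v) for v in samples]

print(min(samples) < 0)      # True  — raw AC swings negative
print(min(rectified) >= 0)   # True  — rectified output never does
```

Note the rectified waveform still dips toward zero twice per cycle; smoothing that ripple into steady DC is the job of the filtering and regulation stages.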
Believe it or not, these tasks are not particularly hard to do. Modern phone chargers are approximately the size of a thumb and serve to convert 120 volts AC at 60 Hz (North America; other parts of the world use different standards) to steady 5 volts DC.
You may notice, however, that in my examples above I used two quantities: a single electron (negative charge carrier) and a coulomb (6.241 * 10^18 electrons). Converting small amounts of energy is easy; converting large amounts is hard. That phone charger may be rated to deliver an output of 0.5 amperes at 5 volts (one ampere is defined as one coulomb per second).
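The ampere-to-electrons conversion above can be sketched in a couple of lines of Python, using the same 6.241e18 electrons-per-coulomb figure:

```python
# Current is charge per unit time: 1 A = 1 C/s, and one coulomb is
# about 6.241e18 elementary charges.
electrons_per_coulomb = 6.241e18
current_amps = 0.5                 # the phone charger's rated output

electrons_per_second = current_amps * electrons_per_coulomb
print(electrons_per_second)        # 3.1205e+18

# Power delivered at the rated output: P = V * I
watts = 5.0 * current_amps         # 2.5 W
```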
Most high end PSUs can deliver upwards of 80 amperes on the all-important +12 volt rail without the +12 volt rail dropping in amplitude. Attempting to draw more than 0.5 amperes from the phone charger would result in the amplitude of the output dropping below 5 volts, violating task #2, which specifies that the output be minimally variant with respect to the attached load.
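For a sense of scale, the power on that rail follows from P = V × I; a quick Python check comparing it against the phone charger:

```python
# Power = voltage * current: an 80 A draw on the +12 V rail.
rail_volts = 12.0
rail_amps = 80.0
rail_watts = rail_volts * rail_amps
print(rail_watts)                    # 960.0

# Versus the phone charger's 2.5 W (5 V at 0.5 A):
charger_watts = 5.0 * 0.5
print(rail_watts / charger_watts)    # 384.0
```

That single rail handles nearly 400 times the power of the phone charger, which is the "converting large amounts is hard" problem in a nutshell.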
Moving on, the laws of thermodynamics dictate that no electrical conversion process can be 100% efficient. In other words, it takes energy to convert energy. The phone charger delivers up to 2.5 watts (one watt is defined as one joule per second), as evidenced by its rating of 0.5 amperes at 5 volts. Another way of looking at it: 3.12 * 10^18 electrons per second, each dissipating approximately 5 electron volts relative to the reference point. However, in order to perform the rectification, level conversion, noise suppression, and safety functions, the charger must draw more power from the wall than it delivers to the device. The phone charger may draw 4 watts from the wall and deliver only 2.5 watts to the phone. The remaining 1.5 watts is dissipated as other forms of energy, mostly heat. This device is only 62.5% efficient.
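The efficiency arithmetic in the phone-charger example, worked in Python:

```python
# Efficiency = useful power out / power drawn from the wall.
# Figures from the phone-charger example above.
p_out = 2.5                  # watts delivered to the phone
p_in = 4.0                   # watts drawn from the wall

efficiency = p_out / p_in
waste_heat = p_in - p_out

print(f"{efficiency:.1%}")   # 62.5%
print(waste_heat)            # 1.5
```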
1.5 watts is not a large amount of heat for a small phone charger, but the same rules hold true for PC power supplies. An inefficient PSU can quite literally burn up and catch fire as a result of the amount of heat it generates. For example, a crummy 800 watt PSU that is at most 70% efficient will draw roughly 1143 watts from the wall at full load (800 / 0.70), burning off about 343 watts as heat in the process. That's more heat than an AMD R9 280X generates.
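The same wall-draw-versus-waste-heat relationship, as a small Python sketch (the function name is just illustrative):

```python
def wall_draw(output_watts, efficiency):
    """Power drawn from the wall to deliver output_watts at a
    given efficiency, plus the difference dissipated as heat."""
    p_in = output_watts / efficiency
    return p_in, p_in - output_watts

# A 70%-efficient 800 W PSU running at full load:
p_in, heat = wall_draw(800, 0.70)
print(round(p_in))   # 1143
print(round(heat))   # 343

# The same load on a 90%-efficient unit wastes far less:
p_in, heat = wall_draw(800, 0.90)
print(round(heat))   # 89
```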
Crummy low power PSUs can afford to be inefficient because their smaller waste heat is less likely to be hazardous or to damage the device. However, high power PSUs must reach a certain level of efficiency (accomplished in part by using high quality components) to be commercially viable. This is why most quality 1,000+ watt PSUs are in the $200-$250 range: they are between 90% and 95% efficient at their nominal operating conditions.
I hope that this post answered your question!