I read this: http://ph.answers.yahoo.com/question/index?qid=20120529... and in one of the answers at the bottom, someone said, "850W? My GTX570 has only pulled 219W max... the 500W requirement is just bs." How does a person find out how many watts a video card draws? A PCIe x16 slot only supplies 75W, and two 6-pin connectors supply 75W each, so that's 225W in total. Why would AMD or NVIDIA make their power supply requirements 300 or 400W more than what you would need?
First of all, those are requirements for the whole system, not just the card.
Secondly, manufacturers exaggerate the numbers on purpose, to be safe, so nobody can blame them if the card fails because of an inadequate power supply. There are plenty of crappy power supplies on the market that cannot actually deliver the wattage they promise:
An 850W rating on a PSU label means little on its own. What matters is the rating on the +12V rail. A typical 500W PSU only delivers somewhere between 360W and 460W on the +12V rail, and since your motherboard, CPU, and GPU are all powered from the +12V rail, it is essential to have enough amperage there. Hence a "500W" label is the safe margin. Some dodgy-brand PSUs, like the Shaw, only output 300W even though they are labelled 680W.
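To make the rail point concrete, here is a minimal sketch of the arithmetic (the amperage figure is hypothetical; check the sticker on your own PSU for the +12V rail's rated amps):

```python
def rail_watts(amps, volts=12.0):
    """Watts available on a rail: P = V * I."""
    return amps * volts

# A PSU labelled "500W" whose +12V rail sticker says 34A
# can really only feed the CPU/GPU/motherboard about:
print(rail_watts(34))  # 34A * 12V = 408.0W, well under the label
```

So two PSUs with the same label can differ by 100W or more where it actually counts.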
AMD and NVIDIA tend to "exaggerate" the recommended power supply because:
1. They do not know exactly what components you have in your system.
1a. AMD CPUs tend to use more power than Intel CPUs.
1b. How many hard drives do you have? I have 6 + 2 optical drives. They all require power.
1c. What kind of motherboard do you have? A "budget" mobo has fewer features, so it tends to consume relatively little power (~35W), while a "deluxe premium" mobo with many features can consume much more (70W - 95W).
2. Do you overclock? This uses more power.
3. Do you have a crappy, average or high quality power supply? I stick with high quality.
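Adding up the points above shows why the recommendation lands so far above the card's own draw. This is a rough sketch with hypothetical component figures (the GPU number is the 225W ceiling from the question; the rest are illustrative):

```python
# Hypothetical per-component draws in watts, for illustration only.
components = {
    "gpu (slot 75W + two 6-pin @ 75W)": 225,
    "cpu": 125,
    "motherboard (feature-rich)": 70,
    "drives (6 HDD + 2 optical, ~10W each)": 80,
    "fans, usb, misc": 30,
}

total = sum(components.values())      # raw system draw
headroom = 1.3                        # ~30% margin for overclocking,
                                      # aging capacitors, and optimistic labels
recommended = total * headroom

print(total)               # 530
print(round(recommended))  # 689
```

With a weak +12V rail on a cheap unit, even that 689W figure argues for buying a quality PSU rated higher still, which is roughly where the vendors' 850W recommendations come from.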