750W vs 850W

seanmc122

Distinguished
Mar 5, 2011
Is there a big difference in what I'll be able to do with an 850W PSU versus a 750W one? I plan on running 2x GTX 570 Superclocked with an i5 2500K overclocked to about 4.6 GHz under water cooling, plus a 128GB SSD and a 2TB WD Caviar Green. Will it make a difference?
 
For a system using a single NVIDIA reference design GeForce GTX 570 graphics card, NVIDIA specifies a minimum 550 Watt or greater power supply with a combined +12 Volt continuous current rating of 38 Amps or greater and at least two 6-pin PCI Express supplementary power connectors.

For a system using two NVIDIA reference design GeForce GTX 570 graphics cards in 2-way SLI mode, NVIDIA specifies a minimum 800 Watt or greater power supply with a combined +12 Volt continuous current rating of 56 Amps or greater and at least four 6-pin PCI Express supplementary power connectors.

Total Power Supply Wattage is NOT the crucial factor in power supply selection!!! Total Continuous Amperage Available on the +12V Rail(s) is the most important factor.
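To see why the label wattage can mislead, here's a rough sketch of the arithmetic (the 60 A figure is purely illustrative, not a spec for any particular unit): usable +12V power is the combined +12V amperage times 12 Volts, so a nominal "750 W" unit rated for only 60 A combined can deliver about 720 W on the +12V rails, regardless of what the box says.

```python
# Rough sketch: usable +12V power = combined +12V amps x 12 V.
# The amperage figures below are illustrative only.
def twelve_volt_capacity(amps_12v: float) -> float:
    """Watts actually available on the +12V rail(s)."""
    return amps_12v * 12.0

print(twelve_volt_capacity(60))  # a "750 W" unit with 60 A combined -> 720 W on +12V
print(twelve_volt_capacity(70))  # 70 A combined -> 840 W on +12V
```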

Since you will be using two factory-overclocked GeForce GTX 570 graphics cards, and may even want to overclock them further yourself, a combined +12 Volt continuous current rating of 70 Amps or greater would be my recommendation. The manually overclocked graphics cards should be able to survive running FurMark without the power supply shutting down due to lack of +12V rail capacity.
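As a rough sanity check on that 70 A figure, here's a ballpark +12V budget for a build like yours. The per-component wattages below are assumptions for illustration (worst-case FurMark-type loads on overclocked cards), not measured values, so check your own hardware's specs:

```python
# Ballpark +12V budget for two OC'd GTX 570s plus an OC'd i5-2500K.
# All wattages are assumptions for illustration only.
gpu_watts = 2 * 250    # two factory-OC'd GTX 570s pushed further, worst case (FurMark)
cpu_watts = 150        # i5-2500K overclocked to ~4.6 GHz under load
misc_watts = 50        # pump, fans, drives and other +12V loads
total_12v_watts = gpu_watts + cpu_watts + misc_watts
amps_needed = total_12v_watts / 12.0
print(f"~{total_12v_watts} W on +12V -> ~{amps_needed:.0f} A combined")  # ~58 A before headroom
```

Adding roughly 20% headroom for capacitor aging and load spikes puts you right around the 70 Amp mark.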

Reputable name-brand 850 Watt or greater power supplies will have a combined +12 Volt continuous current rating of 70 Amps or greater and the appropriate number of PCI Express supplementary power connectors.