According to NVIDIA's own specifications, a single reference-clocked GeForce GTX 570 has a Graphics Card Power of 219 Watts. This means it will draw up to 18.25 Amps from the +12 Volt rail(s) of the power supply unit. NVIDIA also recommends a system power supply of at least 550W with a +12 Volt current rating of at least 38 Amps.
For two GeForce GTX 570 cards in 2-way SLI, a system power supply of at least 800W with a +12 Volt current rating of at least 57 Amps is recommended.
The Cooler Master Silent Pro Gold 700W (RS-700-80GA-D3) is able to provide 56 Amps on its single +12 Volt rail and should, in theory, be capable of handling two GeForce GTX 570 cards in 2-way SLI.
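The arithmetic above can be sketched out quickly. This is just a back-of-the-envelope check using the figures quoted in the thread (219 W per card, NVIDIA's 57 A SLI recommendation, 56 A for the Silent Pro Gold 700W); it assumes, as a worst case, that the card's full board power comes off the +12 Volt rail.

```python
def amps_at_12v(watts):
    """Current (in Amps) a load of `watts` draws from the +12 Volt rail."""
    return watts / 12.0

per_card = amps_at_12v(219)   # 219 W / 12 V = 18.25 A per GTX 570
two_cards = 2 * per_card      # 36.5 A for the pair, GPUs alone

psu_12v_rating = 56           # Silent Pro Gold 700W, single +12 V rail
nvidia_sli_rec = 57           # NVIDIA's 2-way SLI recommendation

print(f"Per card:            {per_card:.2f} A")
print(f"Two cards in SLI:    {two_cards:.2f} A")
print(f"Headroom on the PSU: {psu_12v_rating - two_cards:.2f} A")
print(f"Vs NVIDIA's SLI rec: {psu_12v_rating - nvidia_sli_rec} A")
```

The two cards alone leave roughly 19.5 A of headroom on the 56 A rail for the CPU, drives, and fans, which is why the 700W unit is plausible on paper even though it falls 1 A short of NVIDIA's blanket 57 A recommendation.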
You MIGHT be pushing it with that. Nvidia's SLI site doesn't list the 570, but it does list dual 470s, so I went with that since they consume roughly the same amount of power. The Cooler Master Silent Pro Gold 800W is listed, but not the 700W; however, the Silent Pro M700 is listed ( http://www.slizone.com/object/slizone_build_psu.html ). If you are overclocking, you always have to take that into account. In my opinion it is not safe to use the 700W unit; I'd look at an 800W Gold or 850W Bronze PSU. P.S. Your system config says you have an 800W CM Silent Gold. Is this a different PSU in question? For a high-end system that you are overclocking, it's always better to have MORE headroom than to sit on the fringe of OK.
So tell us why Tom's used a 1,000 Watt power supply unit when the system power using the Nvidia GeForce GTX 590 never exceeded 475 Watts. Even with two Nvidia GeForce GTX 580 cards in SLI, the system power never exceeded 590 Watts.
Why did Guru3D use a 1,200 Watt power supply unit for their testing, if you think power supply wattage is so critical?