Power And Heat
The GeForce GTX 580 GPU consumes so much energy that Nvidia implemented a feature that throttles the GPU under unrealistic workloads like FurMark. You might have heard that AMD did the same thing with its Radeon HD 6970, but this editor’s tests have shown that AMD's limiter can be worked around, so long as the card doesn't run into a thermal problem.
For the first time, we're forced to use 3DMark rather than FurMark to measure power draw on our $2000 machine. Because the GeForce GTX 580’s voltage regulator throttles the card under FurMark, the peak power numbers from the other machines are the more reliable ones.
A pair of GeForce GTX 580s consumes monster power, even under this admittedly lighter load. Idle power is still part of the mix, however, since we believe most users don't run their systems at 100% of their performance potential for every minute of operation.
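The point about idle time tempering a high peak draw can be made concrete with a quick duty-cycle calculation. The wattages, session length, and load fraction below are invented for illustration only, not measurements from our test systems:

```python
# Hypothetical illustration of duty-cycle-weighted power draw.
# All wattages, hours, and fractions are assumed example values.

def average_power(idle_w, load_w, load_fraction):
    """Weighted average draw (watts) for a given fraction of time under load."""
    return idle_w * (1 - load_fraction) + load_w * load_fraction

def monthly_energy_kwh(avg_w, hours_per_day=8, days=30):
    """Energy used over a month of daily sessions, in kilowatt-hours."""
    return avg_w * hours_per_day * days / 1000

# Example: 150 W idle, 700 W gaming, gaming 25% of an 8-hour day.
avg = average_power(idle_w=150, load_w=700, load_fraction=0.25)
print(avg)                        # 287.5 W average draw
print(monthly_energy_kwh(avg))    # 69.0 kWh per month
```

Even a system with a fearsome peak draw averages out to a far more modest figure once idle hours dominate, which is why we weigh idle consumption alongside peak numbers.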
Anyone questioning the $2000 PC’s use of a mediocre CPU cooler as a potential reason for its poor overclock can stop here, as we’ve reached far higher overclocks with Sandy Bridge-based processors running over 30° warmer.
The $1000 machine’s CPU temperature rises significantly when overclocked, thanks to the added stress on its small cooler, while the $2000 PC actually runs cooler when overclocked because its smart fan function is disabled. The overclocked $500 machine's temperature results are less surprising to anyone who remembers that it was overclocked without a voltage increase.