Nvidia’s idle power consumption is just as impressive as AMD’s. Its GeForce GTX 680 sits right between the Radeon HD 7970 and 7950 on Windows’ desktop.
However, Nvidia doesn’t enjoy the benefits of AMD’s ZeroCore technology once our test platform shuts off its display. Both Radeon HD 7900s shed an additional 13-16 W, while GeForce GTX 680 only drops two watts. That’s certainly an improvement over the GeForce GTX 580, but AMD unquestionably holds the advantage here.
Now, here’s where stuff gets real.
As we saw several times in the performance benchmarks, the GeForce GTX 680 actually comes close to matching the performance of GeForce GTX 590 and Radeon HD 6990 in a few situations, and can even beat them in a title like DiRT 3. But look at the difference in power consumption.
The GeForce GTX 590—even though we think Nvidia did a fair job keeping it cool and quiet—is an ugly power hog. The Radeon HD 6990, which isn’t cooled well or kept quiet at all, is a little better. But still. Yuck.
As we start looking at the single-GPU cards, the situation improves. GeForce GTX 680 sits somewhere between Radeon HD 7970 and 7950—both cards that we’ve already observed to offer substantially better performance per watt than the previous champion, GeForce GTX 580.
What’s also interesting about the 3DMark demo, specifically, is that it throws two different workloads at our contenders. The GeForce GTX 590, in particular, shows that Deep Sea incurs more of a power cost, though all of the cards demonstrate some degree of inconsistent power consumption. Meanwhile, Nvidia’s GeForce GTX 680 gives us a fairly straight line all the way across, illustrating that GPU Boost is continually adjusting clocks/voltage to operate within its power envelope.
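That flat line is the signature of a power-capped control loop: step clocks up while there’s headroom, step them back down when the board exceeds its target. Here’s a minimal sketch of that idea; the function name, power target, and clock numbers are all our own illustrative assumptions, not Nvidia’s actual GPU Boost algorithm:

```python
# Hypothetical sketch of a power-capped boost loop, loosely modeling the
# behavior described above: raise clocks under the power target, back off
# when over it. All constants are illustrative, not Nvidia's real values.

POWER_TARGET_W = 195.0   # assumed board power target
BASE_CLOCK_MHZ = 1006.0  # floor: never throttle below base clock
MAX_BOOST_MHZ = 1110.0   # ceiling: highest boost bin
STEP_MHZ = 13.0          # assumed boost-bin granularity

def next_clock(current_mhz: float, measured_power_w: float) -> float:
    """Return the clock to use for the next sample interval."""
    if measured_power_w < POWER_TARGET_W and current_mhz + STEP_MHZ <= MAX_BOOST_MHZ:
        return current_mhz + STEP_MHZ   # headroom left: step the clock up
    if measured_power_w > POWER_TARGET_W and current_mhz - STEP_MHZ >= BASE_CLOCK_MHZ:
        return current_mhz - STEP_MHZ   # over target: step back down
    return current_mhz                   # at a limit, or on target: hold

# A light load leaves power headroom, so the clock steps up one bin:
print(next_clock(1006.0, 150.0))  # 1019.0
```

Run at a fixed sample interval, a loop like this converges on whatever clock keeps measured power hugging the target, which is exactly the near-constant draw the GTX 680 shows across both 3DMark scenes.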
But although this information is interesting in theory, it’s not particularly telling of performance per watt. So, let’s dice this up another way…
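Dicing it up that way just means dividing a performance result by average board power. The arithmetic is trivial, but worth pinning down; the figures below are made-up placeholders for illustration, not our measured results:

```python
# Performance per watt is simply average frame rate divided by average
# board power over the same run. The inputs here are placeholder numbers,
# not measurements from our test platform.

def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Frames per second delivered for each watt consumed."""
    return avg_fps / avg_power_w

# Hypothetical comparison: a faster card can still be the less
# efficient one once power draw enters the picture.
card_a = perf_per_watt(60.0, 200.0)  # 60 fps at 200 W
card_b = perf_per_watt(55.0, 150.0)  # 55 fps at 150 W
print(round(card_a, 3))  # 0.3
print(round(card_b, 3))  # 0.367
```

In that hypothetical, the slower card wins on efficiency, which is the whole reason raw performance charts and power charts need to be read together.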