
Nvidia’s idle power consumption is just as impressive as AMD’s. Its GeForce GTX 680 sits right between the Radeon HD 7970 and 7950 at the Windows desktop.

However, Nvidia doesn’t enjoy the benefits of AMD’s ZeroCore technology once our test platform shuts off its display. Both Radeon HD 7900s shed an additional 13-16 W, while the GeForce GTX 680 only drops two watts. That’s certainly an improvement over the GeForce GTX 580, but AMD unquestionably holds the advantage here.

Now, here’s where stuff gets real.
As we saw several times in the performance benchmarks, the GeForce GTX 680 actually comes close to matching the performance of GeForce GTX 590 and Radeon HD 6990 in a few situations, and can even beat them in a title like DiRT 3. But look at the difference in power consumption.
The GeForce GTX 590—even though we think Nvidia did a fair job keeping it cool and quiet—is an ugly power hog. The Radeon HD 6990, which isn’t cooled well or kept quiet at all, is a little better. But still. Yuck.
As we start looking at the single-GPU cards, the situation improves. GeForce GTX 680 sits somewhere between Radeon HD 7970 and 7950—both cards that we’ve already observed to offer substantially better performance per watt than the previous champion, GeForce GTX 580.
What’s also interesting about the 3DMark 11 demo, specifically, is that it throws two different workloads at our contenders. The GeForce GTX 590, in particular, shows that the Deep Sea scene incurs more of a power cost, though all of the cards demonstrate some degree of inconsistent power consumption. Meanwhile, Nvidia’s GeForce GTX 680 traces a fairly straight line all the way across, illustrating that GPU Boost is continually adjusting clocks and voltage to operate within its power envelope.
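To make that behavior concrete, here’s a minimal, hypothetical sketch of a power-limited control loop in Python. Nvidia hasn’t published GPU Boost’s actual algorithm, so the power target, clock ceiling, step size, and the simple step logic below are all our own assumptions, meant only to illustrate the idea of raising clocks while there’s power headroom and backing off once the envelope is exceeded.

```python
# Hypothetical sketch of a GPU Boost-style control loop. Nvidia's actual
# algorithm is proprietary; every constant below is an assumption made
# purely for illustration.

POWER_TARGET_W = 170   # assumed power target the card tries to stay under
BASE_CLOCK_MHZ = 1006  # GTX 680 base clock
MAX_BOOST_MHZ = 1110   # assumed upper bound on the boost clock
STEP_MHZ = 13          # assumed clock adjustment granularity

def boost_step(clock_mhz: int, measured_power_w: float) -> int:
    """Return the next clock: step up while under the power target,
    step back down once the target is exceeded."""
    if measured_power_w < POWER_TARGET_W and clock_mhz < MAX_BOOST_MHZ:
        return clock_mhz + STEP_MHZ
    if measured_power_w > POWER_TARGET_W and clock_mhz > BASE_CLOCK_MHZ:
        return clock_mhz - STEP_MHZ
    return clock_mhz

# A light scene (low power draw) lets clocks climb; a heavy one pulls them back.
clock = BASE_CLOCK_MHZ
for power in (150, 155, 160, 180, 175):  # fabricated power readings, in watts
    clock = boost_step(clock, power)
    print(f"measured {power} W -> next clock {clock} MHz")
```

The net effect of a loop like this is exactly what the chart shows: power draw hugs a flat line while clock rate, not wattage, absorbs the variation between workloads.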
But while this information is interesting in theory, it isn’t particularly telling of performance per watt. So, let’s dice this up another way…
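Before we do, a quick note on how such an index can be computed: performance per watt is simply a performance metric (average frame rate, say) divided by average power draw over the same run. The short Python sketch below shows the arithmetic; the card names are real, but the numbers are placeholders, not our measurements.

```python
# Performance-per-watt arithmetic. The figures below are placeholders
# for illustration only, NOT measured results from this review.
cards = {
    "GeForce GTX 680": {"avg_fps": 60.0, "avg_power_w": 170.0},
    "Radeon HD 7970":  {"avg_fps": 57.0, "avg_power_w": 195.0},
    "GeForce GTX 590": {"avg_fps": 62.0, "avg_power_w": 350.0},
}

for name, d in cards.items():
    print(f"{name}: {d['avg_fps'] / d['avg_power_w']:.3f} fps/W")
```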
- GeForce GTX 680: The Card And Cooling
- GK104: The Chip And Architecture
- GPU Boost: Graphics Afterburners
- Overclocking: I Want More Than GPU Boost
- PCI Express 3.0 And Adaptive V-Sync
- Hardware Setup And Benchmarks
- Benchmark Results: 3DMark 11 (DX 11)
- Benchmark Results: Battlefield 3 (DX 11)
- Benchmark Results: Crysis 2 (DX 9/DX 11)
- Benchmark Results: The Elder Scrolls V: Skyrim (DX 9)
- Benchmark Results: DiRT 3 (DX 11)
- Benchmark Results: World Of Warcraft: Cataclysm (DX 11)
- Benchmark Results: Metro 2033 (DX 11)
- Benchmark Results: Sandra 2012
- Benchmark Results: Compute Performance In LuxMark 2.0
- Benchmark Results: NVEnc And MediaEspresso 6.5
- Temperature And Noise
- Power Consumption
- Performance Per Watt: The Index
- GeForce GTX 680: The Hunter Scores A Kill
Now we just need prices to start dropping, although significant drops probably won’t come until GK110 is released.
Good going Nvidia...
Sigh...
WoW has had DX11 for quite a long time now. Also, go play in a 25-man raid with every detail setting on ultra with 8x AA and 16x AF and tell me WoW is not taxing on a PC.
...oh, wait.
For everyone suggesting that nVidia will release another true "flagship" beyond the 680, I think you are spot on, IF AMD gives them a reason to. There's no reason to push it at the moment, as they already hold the crown. If, on the other hand, AMD goes out and makes a 7980 or 7970 SE card with higher clocks (more like what the 7970 can achieve when properly overclocked), I definitely see nVidia stepping their game up a bit.
Either way, it's awesome to see both AMD and now nVidia taking power consumption into consideration. I'm tired of my computer room feeling like a toaster after an all-nighter.
He means waiting for GK110, which will be more of a compute card, while this GK104 is geared more toward gaming.