- GeForce GTX 680: The Card And Cooling
- GK104: The Chip And Architecture
- GPU Boost: Graphics Afterburners
- Overclocking: I Want More Than GPU Boost
- PCI Express 3.0 And Adaptive V-Sync
- Hardware Setup And Benchmarks
- Benchmark Results: 3DMark 11 (DX 11)
- Benchmark Results: Battlefield 3 (DX 11)
- Benchmark Results: Crysis 2 (DX 9/DX 11)
- Benchmark Results: The Elder Scrolls V: Skyrim (DX 9)
- Benchmark Results: DiRT 3 (DX 11)
- Benchmark Results: World Of Warcraft: Cataclysm (DX 11)
- Benchmark Results: Metro 2033 (DX 11)
- Benchmark Results: Sandra 2012
- Benchmark Results: Compute Performance In LuxMark 2.0
- Benchmark Results: NVEnc And MediaEspresso 6.5
- Temperature And Noise
- Power Consumption
- Performance Per Watt: The Index
- GeForce GTX 680: The Hunter Scores A Kill
PCI Express 3.0 And Adaptive V-Sync
PCI Express 3.0: One Last Perf Point
GeForce GTX 680 includes a 16-lane PCI Express interface, just like almost every other graphics card we’ve reviewed in the last seven or so years. However, it’s one of the first boards with third-gen support. All six Radeon HD 7000 family members beat the GeForce GTX 680 to market in this regard. But we already know that, in today’s games, doubling the data rate of a bus that isn’t currently saturated doesn’t impact performance very much.
By default, GTX 680 runs at PCIe 2.0 data rates on X79 platforms; enabling PCIe 3.0 requires a driver update.
Nevertheless, PCI Express 3.0 support becomes a more important discussion point here because Nvidia’s press driver doesn’t enable it on X79-based platforms. The company’s official stance is that the card is gen-three-capable, but that X79 Express is only validated for second-gen data rates. Drop it into an Ivy Bridge-based system, though, and it should immediately enable 8 GT/s transfer speeds.
Nvidia sent us an updated driver to prove that GeForce GTX 680 does work, and indeed, data transfer bandwidth shot up to almost 12 GB/s. If Nvidia ever validates GTX 680 on X79, a driver update should be all that’s needed. In contrast, the data bandwidth of AMD’s Radeon HD 7900s slides back from what we’ve seen in previous reviews. Neither AMD nor Gigabyte is able to explain why this is happening.
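As a sanity check on those numbers, the theoretical per-direction ceiling of a x16 link follows from the raw transfer rate and the line encoding: PCIe 2.0 uses 8b/10b encoding at 5 GT/s, while PCIe 3.0 uses 128b/130b at 8 GT/s. The measured ~12 GB/s sits below the gen-three peak because real transfers also carry protocol and benchmark overhead. A quick back-of-the-envelope sketch:

```python
def pcie_bandwidth_gbs(gt_per_s, lanes, encoding_efficiency):
    """Theoretical peak PCIe bandwidth per direction, in GB/s:
    transfer rate (GT/s) x lanes x encoding efficiency, converted to bytes."""
    return gt_per_s * lanes * encoding_efficiency / 8  # 8 bits per byte

gen2 = pcie_bandwidth_gbs(5.0, 16, 8 / 10)     # 8b/10b encoding -> 8.0 GB/s
gen3 = pcie_bandwidth_gbs(8.0, 16, 128 / 130)  # 128b/130b encoding -> ~15.75 GB/s
print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")
```

So a move from gen two to gen three roughly doubles the ceiling, which is why the jump from our earlier X79 measurements to almost 12 GB/s is plausible rather than anomalous.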
Adaptive V-Sync: Smooth Is Good
When we benchmark games, we’re perpetually looking for ways to turn off vertical synchronization, or v-sync, which creates a relationship between our monitors’ refresh and graphics card frame rate. By locking our frame rate to 60 FPS on a 60 Hz LCD, for example, we wouldn’t be conveying the potential performance of a high-end graphics card capable of averaging 90 or 100 FPS. In most titles, turning off v-sync is a simple switch. In others, we have to hack our way around the feature to make the game testable.
In the real world, however, you want to use v-sync to prevent tearing—an artifact that occurs when in-game frame rates are higher than the display’s refresh and you show more than one frame on the screen at a time. Tearing bothers gamers to varying degrees. However, if you own a card capable of keeping you above a 60 FPS minimum, there’s really no downside to turning v-sync on.
Dropping under 60 FPS is where you run into problems. Because the technology synchronizes the graphics card’s output with a fixed refresh, any rate below 60 FPS has to be an even divisor of 60 (30, 20, 15, and so on). So, a card rendering 47 frames per second actually gets forced down to 30 FPS. The transition from 60 to 30 manifests on-screen as a slight stutter. Again, the degree to which this bothers you during game play is going to vary. If you know where and when to expect the stutter, though, spotting it is pretty easy.
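That snapping behavior is easy to model. With double-buffered v-sync, a frame that misses the refresh deadline waits for the next refresh, so the effective rate drops to the largest value of refresh/n the card can sustain. A minimal, purely illustrative sketch (not how any driver is actually implemented):

```python
def vsync_rate(raw_fps, refresh_hz=60):
    """Effective frame rate under double-buffered v-sync: each frame is held
    until a refresh boundary, so the rate snaps down to the largest divisor
    refresh_hz / n that the raw frame rate can keep up with."""
    if raw_fps >= refresh_hz:
        return refresh_hz
    n = 2
    while refresh_hz / n > raw_fps:
        n += 1
    return refresh_hz / n

print(vsync_rate(47))  # 30.0 -- the 47 FPS example from above
print(vsync_rate(25))  # 20.0
```

This is why the penalty is so visible: a card that slips just under 60 FPS loses half of its displayed frame rate, not just the few frames it actually fell short.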
Nvidia’s solution to the pitfalls of running with v-sync on or off is called adaptive v-sync. Basically, any time your card pushes more than 60 FPS, v-sync remains enabled. When the frame rate drops below that barrier, v-sync is turned off to prevent stuttering. The 300.99 driver provided with press boards enables adaptive v-sync through a drop-down menu that also contains settings for turning v-sync on or off.
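Reduced to its decision rule, the behavior described above looks something like the following sketch. This is purely illustrative; Nvidia’s actual driver logic is not public, and the real implementation works per-frame at swap time rather than on a measured average:

```python
def adaptive_vsync_enabled(raw_fps, refresh_hz=60):
    """Illustrative adaptive v-sync decision rule: keep v-sync on while the
    GPU can outrun the display (avoids tearing), switch it off when it can't
    (avoids the stutter of snapping down to refresh_hz / 2)."""
    return raw_fps >= refresh_hz

print(adaptive_vsync_enabled(75))  # True  -- synced, no tearing
print(adaptive_vsync_enabled(47))  # False -- unsynced, no 30 FPS snap
```

The trade-off is that tearing can reappear during the unsynced stretches, but only in the below-60 FPS ranges where the alternative would have been a harsher drop to 30 FPS.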
Given limited time for testing, I was only really able to play a handful of games with and without v-sync, and then using adaptive v-sync. The tearing effect with v-sync turned off is the most distracting artifact. I’m less bothered when v-sync is on. Though, to be honest, it takes a title like Crysis 2 at Ultra quality to bounce above and below 60 FPS with any regularity on a GeForce GTX 680.
Overall, I’d call adaptive v-sync a good option to have, particularly as it permeates slower models in Nvidia’s line-up, which are more likely to spend time under the threshold of a display’s native refresh rate.