PCI Express 3.0 And Adaptive V-Sync

GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation

PCI Express 3.0: One Last Perf Point

GeForce GTX 680 includes a 16-lane PCI Express interface, just like almost every other graphics card we’ve reviewed in the last seven or so years. However, it’s one of the first boards with third-gen support. All six Radeon HD 7000 family members preempt the GeForce GTX 680 in this regard. But we already know that, in today’s games, doubling the data rate of a bus that isn’t currently saturated doesn’t impact performance very much.

By default, GTX 680 runs in X79 at PCIe 2.0 data rates. Enabling PCIe 3.0 is achieved through a driver update.

Nevertheless, PCI Express 3.0 support becomes a more important discussion point here because Nvidia’s press driver doesn’t enable it on X79-based platforms. The company’s official stance is that the card is gen-three-capable, but that X79 Express is only validated for second-gen data rates. Drop it into an Ivy Bridge-based system, though, and it should immediately enable 8 GT/s transfer speeds.

Nvidia sent us an updated driver to prove that GeForce GTX 680 does work at third-gen rates on X79, and indeed, data transfer bandwidth shot up to almost 12 GB/s. Should Nvidia decide to validate GTX 680 for PCIe 3.0 on X79, a driver update should be all it takes. In contrast, the data bandwidth of AMD’s Radeon HD 7900s slides back from what we’ve seen in previous reviews. Neither AMD nor Gigabyte is able to explain why this is happening.
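
For context, the theoretical ceilings work out as follows. This is a quick back-of-the-envelope sketch using the published per-lane rates and encoding overheads; the almost-12 GB/s number above is our measured result, not a computed one:

    # Rough per-direction payload bandwidth of a x16 PCI Express link.
    # PCIe 2.0: 5 GT/s per lane with 8b/10b encoding   -> 80% efficiency
    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~98.5% efficiency
    def pcie_x16_bandwidth_gbs(gt_per_s, encoding_efficiency, lanes=16):
        bits_per_second = gt_per_s * 1e9 * encoding_efficiency * lanes
        return bits_per_second / 8 / 1e9  # convert to gigabytes per second

    gen2 = pcie_x16_bandwidth_gbs(5.0, 8 / 10)     # ~8.0 GB/s
    gen3 = pcie_x16_bandwidth_gbs(8.0, 128 / 130)  # ~15.75 GB/s
    print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")

Real-world throughput always lands below those ceilings once protocol and driver overhead are factored in, so a result approaching 12 GB/s clearly indicates the card is running at gen-three rather than gen-two rates.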

Adaptive V-Sync: Smooth Is Good

When we benchmark games, we’re perpetually looking for ways to turn off vertical synchronization, or v-sync, which creates a relationship between our monitors’ refresh and graphics card frame rate. By locking our frame rate to 60 FPS on a 60 Hz LCD, for example, we wouldn’t be conveying the potential performance of a high-end graphics card capable of averaging 90 or 100 FPS. In most titles, turning off v-sync is a simple switch. In others, we have to hack our way around the feature to make the game testable.

In the real world, however, you want to use v-sync to prevent tearing—an artifact that occurs when in-game frame rates are higher than the display’s refresh and you show more than one frame on the screen at a time. Tearing bothers gamers to varying degrees. However, if you own a card capable of keeping you above a 60 FPS minimum, there’s really no downside to turning v-sync on.

Dropping under 60 FPS is where you run into problems. Because the technology synchronizes the graphics card’s output to a fixed refresh, any frame rate below 60 FPS has to fall to an even divisor of the refresh rate: 30, 20, 15, and so on. So, rendering at 47 frames per second, for instance, actually forces you down to 30 FPS. The transition from 60 to 30 manifests on-screen as a slight stutter. Again, the degree to which this bothers you during game play is going to vary. If you know where and when to expect the stutter, though, spotting it is pretty easy.
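
A quick sketch makes that math concrete (the 60 Hz refresh and the sample frame rates are illustrative, and this simplified model ignores triple buffering):

    import math

    REFRESH_HZ = 60.0
    REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes

    def vsync_effective_fps(render_fps):
        # A frame that misses a refresh has to wait for the next one,
        # so the delivered rate snaps to 60 divided by a whole number.
        frame_time_ms = 1000.0 / render_fps
        refreshes_needed = math.ceil(frame_time_ms / REFRESH_INTERVAL_MS)
        return REFRESH_HZ / refreshes_needed

    print(vsync_effective_fps(90))  # 60.0 -- capped at the display's refresh
    print(vsync_effective_fps(47))  # 30.0 -- the drop that reads as stutter
    print(vsync_effective_fps(29))  # 20.0 -- the next step down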

Nvidia’s solution to the pitfalls of running with v-sync on or off is called adaptive v-sync. Basically, any time your card pushes more than 60 FPS, v-sync remains enabled. When the frame rate drops below that threshold, v-sync is turned off to prevent stuttering. The 300.99 driver provided with press boards enables adaptive v-sync through a drop-down menu that also contains settings for turning v-sync on or off.
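
Conceptually, the per-frame decision looks something like the sketch below. This is a simplified illustration of the idea rather than Nvidia’s driver code; swap_chain and its present() call are hypothetical stand-ins:

    REFRESH_HZ = 60.0

    def present_adaptive(frame_time_ms, swap_chain):
        # Adaptive v-sync: synchronize only while the GPU keeps up with the display.
        current_fps = 1000.0 / frame_time_ms
        if current_fps >= REFRESH_HZ:
            # Faster than the refresh: wait for vblank so frames don't tear.
            swap_chain.present(sync_interval=1)
        else:
            # Slower than the refresh: present immediately instead of
            # falling all the way to 30 FPS and stuttering.
            swap_chain.present(sync_interval=0)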

Given limited time for testing, I was only really able to play a handful of games with and without v-sync, and then using adaptive v-sync. The tearing effect with v-sync turned off is the most distracting artifact. I’m less bothered when v-sync is on. Though, to be honest, it takes a title like Crysis 2 at Ultra quality to bounce above and below 60 FPS with any regularity on a GeForce GTX 680.

Overall, I’d call adaptive v-sync a good option to have, particularly as it permeates slower models in Nvidia’s line-up, which are more likely to spend time under the threshold of a display’s native refresh rate.

Comments
  • 38
    Anonymous , March 22, 2012 12:46 PM
    Hail to the new king.
  • 44
    borden5 , March 22, 2012 12:55 PM
    oh man this's good news for consumer, hope to see a price war soon
  • 26
    johnners2981 , March 22, 2012 12:58 PM
    Damn prices, in europe we have to pay the equivalent of $650-$700 to get one
  • 33
    outlw6669 , March 22, 2012 12:59 PM
    Nice results, this is how the transition to 28nm should be.
    Now we just need prices to start dropping, although significant drops will probably not come until the GK110 is released :/ 
  • 23
    Anonymous , March 22, 2012 1:00 PM
    Finally we will see prices going down (either way :-) )
  • -4
    Scotty99 , March 22, 2012 1:03 PM
    Its a midrange card, anyone who disagrees is plain wrong. Thats not to say its a bad card, what happened here is nvidia is so far ahead of AMD in tech that the mid range card purposed to fill the 560ti in the lineup actually competed with AMD's flagship. If you dont believe me that is fine, you will see in a couple months when the actual flagship comes out, the ones with the 384 bit interface.
  • 26
    Chainzsaw , March 22, 2012 1:04 PM
    Wow not too bad. Looks like the 680 is actually cheaper than the 7970 right now, about 50$, and generally beats the 7970, but obviously not at everything.

    Good going Nvidia...
  • 32
    SkyWalker1726 , March 22, 2012 1:05 PM
    AMD will certainly Drop the price of the 7xxx series
  • 20
    rantoc , March 22, 2012 1:13 PM
    2x of thoose ordered and will be delivered tomorrow, will be a nice geeky weekend for sure =)
  • 23
    Scotty99 , March 22, 2012 1:21 PM
    Quoting scrumworks: "Nothing surprising here. Little overclocking can put Tahiti right at the same level. Kepler is actually losing to Tahiti in really demanding games like Metro 2033 that uses the latest tech. Pointless to test ancient and low tech games like World of Warcrap that is ancient, uses dx9 and is not considered cutting edge in any meter."

    Sigh...

    WoW has had DX11 for quite a long time now. Also, go play in a 25 man raid with every detail setting on ultra with 8xAA and 16x AAF and tell me WoW is not taxing on a PC.
  • 16
    yougotjaked , March 22, 2012 1:21 PM
    Wait what does it mean by "if you’re interested in compute potential, you’ll have to keep waiting"?
  • 0
    dragonsqrrl , March 22, 2012 1:22 PM
    Just throwing this out there now, but some AMD fanboy will find a way to discredit or marginalize these results.

    ...oh, wait.
  • 24
    klausey , March 22, 2012 1:24 PM
    Great to see nVidia jumping back into the game and forcing AMD to lower its prices accordingly. I was shocked to see the card actually available at the MSRP of $500 on launch day. I guess we'll see how long that lasts.

    For everyone suggesting that nVidia will release another true "flagship" beyond the 680, I think you are spot on, IF AMD gives them a reason to. There's no reason to push it at the moment as they already hold the crown. If, on the other hand, AMD goes out and makes a 7980, or 79070 SE card with higher clocks (more like what the 7970 can achieve when properly overclocked), I definitely see nVidia stepping their game up a bit.

    Either way, it's awesome to see both AMD and now nVidia taking power consumption into consideration. I'm tired of my computer room feeling like a toaster after an all nighter.
  • 18
    rantoc , March 22, 2012 1:24 PM
    Quoting yougotjaked: "Wait what does it mean by 'if you’re interested in compute potential, you’ll have to keep waiting'?"

    He means waiting for the GK110, that will be a more of a compute card while this GK104 is more equiped towards gaming.
  • 13
    EXT64 , March 22, 2012 1:26 PM
    Really disappointing DP compute, but a tradeoff had to be made and this card is meant for gaming, so I can understand their position. Hopefully GK110 is a real card and will eventually come out.
  • -7
    Anonymous , March 22, 2012 1:27 PM
    but will it run tetris with multiple displays?
  • 7
    amk-aka-Phantom , March 22, 2012 1:27 PM
    Oh yeah, team green strikes back! :D  Now let's see what 660 Ti will be like, might suggest that to a friend.