G-Sync Technology Preview: Quite Literally A Game Changer

Testing G-Sync Against V-Sync Enabled

So now it's time to put G-Sync to the test. Bust out the video capture card, multi-SSD array, and get to benchmarking, right?

Wrong.

This isn't a performance story. It's a quality one. In this case, benchmarking only tells us one thing: where the frame rate is at any given point. What it doesn't tell us is how good the experience with or without G-Sync really is. And so we have to rely on carefully written and eloquently delivered words. I'll try to make it as painless as possible.

Why not simply record video and let you watch for yourself? Because a camera captures at a fixed 60 Hz, and your monitor plays that footage back at a constant 60 Hz. G-Sync's refresh rate is variable, so you wouldn't see the technology in action.

Given enough games, there is a seemingly endless number of permutations we could run. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz...the list goes on and on. But we'll start by setting our prototype screen to a 60 Hz refresh rate and gaming with V-sync enabled.

Perhaps the easiest place to start is Nvidia's own demo tool, a pendulum that swings back and forth. It can be set to a simulated 60, 50, or 40 FPS. Or, you can vacillate between 40 and 60. At any of those settings, you can toggle between no V-sync, V-sync enabled, and G-Sync. Contrived though this test may be, it's probably the most dramatic example of the technology possible. You can watch the scene at 50 FPS with V-sync on and think, "Yeah, that's not too bad; I see what appears to be a stutter, but I could live with it." Then, G-Sync is switched on and you slap yourself. "What was I thinking? That's a night-and-day difference. How could I have been alright with that before?"

But then you pinch yourself and remember that this is a tech demo. You want your evidence steeped in real-world gaming. So you fire up something you know is going to be taxing, like Arma III.

In Arma III, I can drop a GeForce GTX 770 into my test machine and dial in the Ultra detail preset. With V-sync off, that's good for frame rates in the 40s or 50s. Turn V-sync on, though, and you're forced down to 30 FPS. Performance isn't high enough to fluctuate between 30 and 60 FPS; instead, the card's frame rate is simply neutered.
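
That 30 FPS wall isn't arbitrary. With double-buffered V-sync, a finished frame has to wait for the next scheduled refresh, so effective frame rates quantize to whole divisors of the refresh rate: 60, 30, 20, 15, and so on. A minimal sketch of the arithmetic, using illustrative render times rather than anything measured in Arma III:

```python
# Why V-sync at 60 Hz quantizes frame rates: each frame occupies a whole
# number of refresh intervals, so any render time over ~16.7 ms costs a
# full extra interval. Render times below are illustrative, not measured.
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per scan

def vsync_fps(render_ms: float) -> float:
    """Effective frame rate with double-buffered V-sync on."""
    intervals = math.ceil(render_ms / REFRESH_MS)  # wait for next refresh
    return 1000.0 / (intervals * REFRESH_MS)

def gsync_fps(render_ms: float) -> float:
    """With G-Sync, the panel refreshes when the frame is ready (within
    the panel's supported range), so no rounding occurs."""
    return 1000.0 / render_ms

for ms in (15.0, 22.0, 30.0):  # ~67, ~45, ~33 FPS of raw rendering
    print(f"{ms:4.1f} ms render -> V-sync: {vsync_fps(ms):5.1f} FPS, "
          f"G-Sync: {gsync_fps(ms):5.1f} FPS")
```

A 22 ms frame (about 45 FPS of raw rendering) misses the first refresh and waits for the second, so V-sync shows 30 FPS: exactly the drop from the 40s described above, and the 10-to-20 FPS gap that G-Sync claws back.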

Because there wasn't any real stuttering before, what you see on-screen with G-Sync enabled isn't significantly different, except that practical performance jumps between 10 and 20 FPS higher. Input lag would be expected to decrease as well, since you no longer have the same frames displayed for multiple monitor scans. I find Arma to be less twitchy than a lot of other games though, so I didn't feel much latency.

Metro: Last Light, on the other hand, makes G-Sync more apparent. Running on a GeForce GTX 770, the game can be set to 1920x1080 at Very High details with 16x AF, normal tessellation, and normal motion blur. From there, you can tweak the SSAA setting from 1x to 2x to 3x to erode away the frame rate incrementally. 

It also helps that the game's starting area includes a hallway that's easy to strafe up and down. I fired the level up with V-sync on at 60 Hz and went to town. Fraps reported frame rates below 30 FPS with SSAA at 3x, where both the stutter and lag are significant, and closer to 60 FPS with SSAA off. Disable SSAA and you walk around in complete fluidity, pinned at 60. Switch to 2x SSAA, though, and the fluctuation between 60 and 30 FPS makes each duplicated frame a painful jerk. This is one of those games I'd normally play with V-sync off, simply ignoring the tearing. I've been doing that for years; it's something I'm just used to.
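
The painful jerk at 2x SSAA has the same arithmetic behind it. When render times hover around the 16.7 ms refresh interval, V-sync flips between one-refresh and two-refresh presentations from frame to frame. A small sketch with hypothetical frame times (not Metro measurements):

```python
# V-sync display times when rendering hovers near the 16.7 ms threshold:
# frames alternate between 16.7 ms and 33.3 ms on screen, which is the
# 60-to-30 FPS judder. Frame times below are hypothetical.
import math

REFRESH_MS = 1000.0 / 60  # ~16.7 ms at 60 Hz

def vsync_display_times(render_times_ms):
    """How long each frame stays on screen with V-sync: rounded up to a
    whole number of refresh intervals."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]

render = [15.0, 18.0, 16.0, 19.0, 15.5, 20.0]  # hypothetical 2x-SSAA frame times
print([round(t, 1) for t in vsync_display_times(render)])
# -> [16.7, 33.3, 16.7, 33.3, 16.7, 33.3]
```

That alternating 16.7/33.3 ms cadence is what reads as stutter; with G-Sync, each frame would simply stay on screen for roughly as long as it took to render, so the pacing stays even.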

But G-Sync makes all of that pain go away. You don't find yourself staring at the Fraps counter, looking for evidence of a dip below 60 FPS that'd compel you to turn off another detail setting. Rather, you can turn them up higher because, even if you do dip to 50 or even 40 FPS, you don't end up with any of that unnatural stutter. And the previous solution, disabling V-sync, well, we'll get to that on the next page.

176 comments
    Top Comments
  • I'm on page 4, and I can't even contain myself.

    Tearing and input lag at 60Hz on a 2560x1440 or 2560x1600 has been the only reason I won't game on one. G-sync will get me there.

    This is awesome, outside-of-the-box thinking tech.

    I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions! Why not get a licensing payment for all monitors sold with this tech? Or all video cards implementing this tech? It just makes sense.
    35
  • Competition, competition. Anybody who is flaming over who is better: AMD or nVidia, is clearly missing the point. With nVidia's G-Sync, and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices, better products.
    23
  • I consider Gsync to be the most important gaming innovation since DX7. It's going to be one of those "How the HELL did we live without this before?" technologies.
    16
  • Other Comments
  • Totally agree, G Sync is really impressive and the technology we have been waiting for.
    What the hell is Mantle?
    -12
  • I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120 Hz monitor with SLI, I could hardly see much difference.

    I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

    At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
    16
  • This needs to be not so proprietary for it to become a game changer. As it is, requiring a specific GPU and specific monitor with an additional price premium just isn't compelling and won't reach a wide demographic.

    Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors make similar solutions, I feel like this is doomed to be as niche as lightboost, Physx, and, I suspect, Mantle.
    7
  • g sync tv pleeeeeeeeeease
    -6
  • Could the Skyrim stuttering at 60hz w/ Gsync be because the engine operates internally at 64hz? All those Bethesda tech games drop 4 frames every second when vsync'd to 60hz which cause that severe microstutter you see on nearby floors and walls when moving and strafing. Same thing happened in Oblivion, Fallout 3, and New Vegas on PC. You had to use stutter removal mods in conjunction with the script extenders to actually force the game to operate at 60hz and smooth it out with vsync on.

    You mention it being smooth when set to 144hz with Gsync, is there any way you cap the display at 64hz and try it with Gsync alone (iPresentinterval=0) and see what happens then? Just wondering if the game is at fault here and if that specific issue is still there in their latest version of the engine.

    Alternatively I suppose you could load up Fallout 3 or NV instead and see if the Gsync results match Skyrim.
    3
  • I would be excited for this if it werent for Oculus Rift. I don't mean to be dismissive, this looks awesome...but it isn't Oculus Rift.
    6
  • Am I the only one who has never experienced screen tearing? Most of my games run past my refresh rate too....
    -1
  • I play a lot of MMOs, and tearing appears all the time. This is nice tech; I can't wait for 2560x1440 with G-Sync.
    1
  • TLM: Physx is not that way anymore. The newest gen of consoles can take advantage of Physx now and I'm sure will. That's hardly niche.
    -5
  • I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot; just like PhysX, there are some people that care about it but most don't even acknowledge it. Technically it can run on anything, practically only on Nvidia, and being a monitor-side gimmick I would see this as a monitor company tech thing, not graphics card maker closed stuff. I'm more interested in Mantle than this, since it preaches better multicore CPU performance and better fps on your hardware.

    Mantle (if it will be what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor.

    G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!
    10
  • You guys should've tried Dead Island with G-Sync, the tearing in this game was really horrible.
    -1
  • I cant wait for G sync on 2560x1440
    -1
  • Needs to be open. If it's not in a standard, it won't take off. How many people want to rip open a display (likely voiding warranty) to install a pricey addon card, and then be strongly limited to what other equipment they can use it with.

    Get it standardised and into the DVI/HDMI/DP specs, then it'll take off.

    I wonder if you could just add a flag for variable vertical blanks, and have it send a 'starting next frame' sequence whenever a frame is rendered.

    If it's not included by default in monitors, it'll become the next PhysX. And to do that it has to be platform-agnostic.
    10
  • 640736 said:
    I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120 Hz monitor with SLI, I could hardly see much difference.


    oh really? I envy your eyes.
    2
  • 216536 said:
    I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot; just like PhysX, there are some people that care about it but most don't even acknowledge it. Technically it can run on anything, practically only on Nvidia, and being a monitor-side gimmick I would see this as a monitor company tech thing, not graphics card maker closed stuff. I'm more interested in Mantle than this, since it preaches better multicore CPU performance and better fps on your hardware. Mantle (if it will be what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor. G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!


    Considering Mantle: what does GPU performance matter on a screen with input lag, or a screen with tearing and choppy, blurry video?

    Mantle will not solve this problem. Mantle is supposed to be more of a low-level common API with enhanced GPU performance as a possible advantage. I'm not sure that even compares to what's being discussed here. Maybe I'm way off???

    G-sync will eliminate input lag, tearing and blur and as a result add to the overall realism of the gaming experience.
    2
  • This will become more important as we migrate to 4k displays. At that resolution, maintaining very high frame rates will become more difficult. Allowing a better experience at lower frame rates will become more important and more valuable.
    2
  • 216536 said:
    I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot; just like PhysX, there are some people that care about it but most don't even acknowledge it. Technically it can run on anything, practically only on Nvidia, and being a monitor-side gimmick I would see this as a monitor company tech thing, not graphics card maker closed stuff. I'm more interested in Mantle than this, since it preaches better multicore CPU performance and better fps on your hardware. Mantle (if it will be what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor. G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!


    the monitor might be expensive right now, but it will be a good investment if you decide to go that route. At the very least, you don't upgrade your monitor as often as a GPU. My current monitor has been paired with a GTS 250, GTX 460, and now GTX 660 SLI. The only downside is it will lock you into using Nvidia GPUs only.
    -1