G-Sync Technology Preview: Quite Literally A Game Changer
You've forever faced this dilemma: disable V-sync and live with image tearing, or turn V-sync on and tolerate the annoying stutter and lag? Nvidia promises to make that question obsolete with a variable refresh rate technology we're previewing today.
Testing G-Sync Against V-Sync Enabled
So now it's time to put G-Sync to the test. Bust out the video capture card, multi-SSD array, and get to benchmarking, right?
Wrong.
This isn't a performance story. It's a quality one. In this case, benchmarking only tells us one thing: where the frame rate is at any given point. What it doesn't tell us is how good the experience with or without G-Sync really is. And so we have to rely on carefully-written and eloquently delivered words. I'll try to make it as painless as possible.
Why not simply record video and let you watch for yourself? A camera records at a fixed frame rate, and your monitor plays that video back at a constant 60 Hz. Because G-Sync varies the refresh rate, you wouldn't see the technology in action in a recording.
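If you want a sense of why that is, here's a minimal sketch (with made-up frame times, not measured data) of how frames presented at variable intervals all collapse onto a capture card's fixed 60 Hz grid; the variable timing you'd actually see on the panel is lost in the recording.

```python
# A minimal sketch with made-up frame times: frames presented at variable
# intervals (what G-Sync shows on the panel) all land on a capture card's
# fixed 60 Hz grid, so the recording can't preserve the variable timing.
CAPTURE_HZ = 60
capture_interval_ms = 1000 / CAPTURE_HZ  # one captured frame every ~16.7 ms

# Hypothetical render times for a game fluctuating between roughly 40 and 60 FPS
frame_times_ms = [16.7, 20.1, 23.8, 25.0, 21.4, 18.2, 16.9]

t = 0.0
for ft in frame_times_ms:
    t += ft
    # The capture only "sees" this frame at the next fixed 60 Hz tick,
    # not at the moment the variable-refresh panel actually updated.
    capture_tick = int(-(-t // capture_interval_ms))  # ceiling division
    print(f"frame ready at {t:6.1f} ms -> lands in capture frame #{capture_tick}")
```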
Given enough games, there is a seemingly endless number of permutations we could run. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz...the list goes on and on. But we'll start by setting our prototype screen to a 60 Hz refresh rate and gaming with V-sync enabled.
Perhaps the easiest place to start is Nvidia's own demo tool, a pendulum that swings back and forth. It can be set to a simulated 60, 50, or 40 FPS. Or, you can vacillate between 40 and 60. With the picture at any of those settings, you toggle between no V-sync, V-sync enabled, and G-Sync. Contrived though this test may be, it's probably the most dramatic example of the technology possible. You can watch the scene at 50 FPS with V-sync on and think, "Yeah, that's not too bad; I see what appears to be a stutter, but I could live with that." Then, G-Sync is switched on and you slap yourself. "What was I thinking? That's a night-and-day difference. How could I have been alright with that before?"
But then you pinch yourself and remember that this is a tech demo. You want your evidence steeped in real-world gaming. So you fire up something you know is going to be taxing, like Arma III.
In Arma III, I can drop a GeForce GTX 770 into my test machine and dial in the Ultra detail preset. With V-sync off, that's good for frame rates in the 40s or 50s. Turn V-sync on, though, and you're forced down to 30 FPS. Performance isn't good enough that you see lots of fluctuation between 30 and 60 FPS. Instead, the card's frame rate is just neutered.
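That drop to 30 FPS is just double-buffered V-sync doing its arithmetic: any frame that takes longer than one 16.7 ms refresh has to wait for the next scan, so it stays on screen for two of them. Here's a minimal sketch of that math (my own illustration, not anything from Nvidia's tooling), comparing what a 60 Hz V-synced panel delivers against a panel that refreshes whenever the frame is ready.

```python
# Minimal sketch (not Nvidia's implementation): why double-buffered V-sync
# on a 60 Hz panel snaps a ~40-55 FPS card down to 30 FPS. A frame that
# misses the 16.7 ms refresh deadline waits for the next one, so it is
# shown for two full scans; a variable-refresh panel updates when the
# frame is ready.
import math

REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ  # 16.67 ms per scan

def effective_fps_vsync(render_ms):
    """Frame is held until the next refresh boundary after it finishes."""
    scans_waited = math.ceil(render_ms / refresh_ms)
    return 1000 / (scans_waited * refresh_ms)

def effective_fps_gsync(render_ms):
    """Panel refreshes as soon as the frame is done (within its VRR range)."""
    return 1000 / render_ms

for fps in (55, 50, 45, 40):
    render_ms = 1000 / fps
    print(f"{fps} FPS render rate -> "
          f"V-sync shows {effective_fps_vsync(render_ms):.0f} FPS, "
          f"G-Sync shows {effective_fps_gsync(render_ms):.0f} FPS")
```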
Because there wasn't any real stuttering before, what you see on-screen with G-Sync enabled isn't significantly different, except that practical performance is 10 to 20 FPS higher. Input lag would be expected to decrease as well, since you no longer have the same frames displayed for multiple monitor scans. I find Arma to be less twitchy than a lot of other games, though, so I didn't notice much latency.
Metro: Last Light, on the other hand, makes G-Sync more apparent. Running on a GeForce GTX 770, the game can be set to 1920x1080 at Very High details with 16x AF, normal tessellation, and normal motion blur. From there, you can tweak the SSAA setting from 1x to 2x to 3x to erode away the frame rate incrementally.
It also helps that the game starts you off in a hallway that's easy to strafe up and down. I fired the level up with V-sync on at 60 Hz and went to town. Fraps reported frame rates down below 30 FPS with SSAA at 3x, and up closer to 60 FPS with SSAA off. At 3x, both the stutter and lag are significant. Disable SSAA and you walk around with complete fluidity, pinned at 60. Switch to 2x SSAA, though, and the variation from 60 to 30 FPS makes each duplicated frame a painful jerk. This is one of those games I'd set to V-sync off, and simply ignore the tearing. I've been doing that for years; it's something I'm just used to.
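To put a number on why that 60-to-30 variation feels so jerky, here's one more small sketch with hypothetical frame times: under V-sync, each frame's time on screen snaps to either one refresh (16.7 ms) or two (33.3 ms), so pacing jumps by a factor of two from one frame to the next, while with G-Sync the time on screen simply tracks the render time.

```python
# A minimal sketch with hypothetical render times hovering around 60 FPS:
# under V-sync each frame is held for a whole number of 16.7 ms refreshes,
# so its time on screen flips between 16.7 ms and 33.3 ms; with G-Sync the
# time on screen tracks the render time.
import math

refresh_ms = 1000 / 60  # one 60 Hz scan

render_ms = [15.8, 17.2, 16.1, 18.5, 15.9, 19.4]  # made-up, ~2x SSAA territory

for r in render_ms:
    vsync_on_screen = math.ceil(r / refresh_ms) * refresh_ms  # snapped to scan boundaries
    gsync_on_screen = r                                       # panel waits for the frame
    print(f"render {r:5.1f} ms -> shown {vsync_on_screen:4.1f} ms (V-sync) "
          f"vs {gsync_on_screen:4.1f} ms (G-Sync)")
```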
But G-Sync makes all of that pain go away. You don't find yourself staring at the Fraps counter, looking for evidence of a dip below 60 FPS that'd compel you to turn off another detail setting. Rather, you can turn them up higher because, even if you do dip to 50 or even 40 FPS, you don't end up with any of that unnatural stutter. And the previous solution, disabling V-sync, well, we'll get to that on the next page.
gamerk316 I consider Gsync to be the most important gaming innovation since DX7. It's going to be one of those "How the HELL did we live without this before?" technologies.
monsta Totally agree, G Sync is really impressive and the technology we have been waiting for.
What the hell is Mantle?
wurkfur I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120hz monitor with SLI, I could hardly see much difference.
I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.
At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
expl0itfinder Competition, competition. Anybody who is flaming over who is better: AMD or nVidia, is clearly missing the point. With nVidia's G-Sync, and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices, better products.
This needs to be not so proprietary for it to become a game changer. As it is, requiring a specific GPU and specific monitor with an additional price premium just isn't compelling and won't reach a wide demographic.
Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors make similar solutions, I feel like this is doomed to be as niche as lightboost, Physx, and, I suspect, Mantle.
ubercake I'm on page 4, and I can't even contain myself.
Tearing and input lag at 60Hz on a 2560x1440 or 2560x1600 has been the only reason I won't game on one. G-sync will get me there.
This is awesome, outside-of-the-box thinking tech.
I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions! Why not get a licensing payment for all monitors sold with this tech? Or all video cards implementing this tech? It just makes sense.
rickard Could the Skyrim stuttering at 60hz w/ Gsync be because the engine operates internally at 64hz? All those Bethesda tech games drop 4 frames every second when vsync'd to 60hz which cause that severe microstutter you see on nearby floors and walls when moving and strafing. Same thing happened in Oblivion, Fallout 3, and New Vegas on PC. You had to use stutter removal mods in conjunction with the script extenders to actually force the game to operate at 60hz and smooth it out with vsync on.
You mention it being smooth when set to 144hz with Gsync, is there any way you cap the display at 64hz and try it with Gsync alone (iPresentinterval=0) and see what happens then? Just wondering if the game is at fault here and if that specific issue is still there in their latest version of the engine.
Alternatively I suppose you could load up Fallout 3 or NV instead and see if the Gsync results match Skyrim.
Old_Fogie_Late_Bloomer I would be excited for this if it weren't for Oculus Rift. I don't mean to be dismissive, this looks awesome...but it isn't Oculus Rift.
hysteria357 Am I the only one who has never experienced screen tearing? Most of my games run past my refresh rate too....