Testing G-Sync Against V-Sync Enabled
So now it's time to put G-Sync to the test. Bust out the video capture card, multi-SSD array, and get to benchmarking, right?
This isn't a performance story; it's a quality one. Benchmarking only tells us one thing: what the frame rate is at any given moment. It doesn't tell us how good the experience actually is with or without G-Sync. So we have to rely on carefully written and eloquently delivered words. I'll try to make this as painless as possible.
Why not simply record video and let you watch for yourself? A camera records at a fixed 60 Hz, and your monitor plays that footage back at a constant 60 Hz too. Because G-Sync varies the refresh rate, you wouldn't see the technology in action.
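The sampling problem can be illustrated with a short sketch (mine, not the article's; the frame timestamps are made up for illustration): frames presented at irregular G-Sync intervals get duplicated or dropped once a fixed 60 Hz camera resamples them, so the recording looks like ordinary fixed-rate playback.

```python
# Illustrative sketch: a 60 Hz camera sampling a variable-refresh display.
# Each camera tick captures whichever source frame is currently on screen.
def capture_at_60hz(frame_times, duration=0.2):
    """Return the index of the source frame visible at each 60 Hz tick."""
    samples = []
    tick = 0.0
    while tick < duration:
        # Latest frame presented at or before this camera tick.
        visible = max(i for i, t in enumerate(frame_times) if t <= tick)
        samples.append(visible)
        tick += 1 / 60
    return samples

# Hypothetical G-Sync presentation times at an uneven ~45 FPS:
gsync_times = [0.0, 0.021, 0.044, 0.070, 0.091, 0.115, 0.142, 0.163, 0.188]
samples = capture_at_60hz(gsync_times)
print(samples)  # some frames repeat, others are never captured cleanly
```

The duplicated indices in the output are exactly the judder a fixed-rate recording would add, which is why video can't demonstrate what G-Sync does.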
Given enough games, there is a seemingly endless number of permutations we could run. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz...the list goes on and on. But we'll start by setting our prototype screen to a 60 Hz refresh rate and gaming with V-sync enabled.
Perhaps the easiest place to start is Nvidia's own demo tool, a pendulum that swings back and forth. It can be set to a simulated 60, 50, or 40 FPS. Or, you can vacillate between 40 and 60. At any of those settings, you toggle between no V-sync, V-sync enabled, and G-Sync. Contrived though this test may be, it's probably the most dramatic example of the technology possible. You can watch the scene at 50 FPS with V-sync on and think, "Yeah, that's not too bad; I see what appears to be a stutter, but I could live with that." Then, G-Sync is switched on and you slap yourself. "What was I thinking? That's a night-and-day difference. How could I have been alright with that before?"
But then you pinch yourself and remember that this is a tech demo. You want your evidence steeped in real-world gaming. So you fire up something you know is going to be taxing, like Arma III.
In Arma III, I can drop a GeForce GTX 770 into my test machine and dial in the Ultra detail preset. With V-sync off, that's good for frame rates in the 40s or 50s. Turn V-sync on, though, and you're forced down to 30 FPS. Because performance never reaches 60 FPS, you don't see lots of fluctuation between 30 and 60; the card's frame rate is simply neutered.
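Why 30 FPS exactly? With double-buffered V-sync, a frame that misses a refresh has to wait for the next one, so the effective rate snaps to the refresh rate divided by a whole number: 60, 30, 20, 15... A minimal sketch of that quantization (my own illustration, not a tool from the article):

```python
import math

# Sketch: effective display rate under double-buffered V-sync. Each frame
# occupies a whole number of refresh intervals, rounded up, so a GPU that
# can't hit the refresh rate snaps to the next divisor (60 -> 30 -> 20...).
def vsync_fps(render_fps, refresh_hz=60):
    """Effective frame rate shown on a fixed-refresh panel with V-sync on."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes per frame
    return refresh_hz / intervals

# A card averaging anywhere in the 40s or 50s gets pinned at 30:
print(vsync_fps(45))  # 30.0
print(vsync_fps(55))  # 30.0
print(vsync_fps(25))  # 20.0
```

This is the "neutered" behavior described above: anything between 31 and 59 FPS of raw rendering performance displays at 30.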
Because there wasn't any real stuttering before, what you see on-screen with G-Sync enabled isn't significantly different, except that practical performance jumps 10 to 20 FPS higher. Input lag should decrease as well, since the same frame is no longer displayed across multiple monitor scans. I find Arma to be less twitchy than a lot of other games, though, so I didn't feel much latency.
Metro: Last Light, on the other hand, makes G-Sync more apparent. Running on a GeForce GTX 770, the game can be set to 1920x1080 at Very High details with 16x AF, normal tessellation, and normal motion blur. From there, you can tweak the SSAA setting from 1x to 2x to 3x to erode away the frame rate incrementally.
It also helps that the game starts you off in a hallway that's easy to strafe up and down. I fired the level up with V-sync on at 60 Hz and went to town. Fraps reported frame rates below 30 FPS with 3x SSAA, and closer to 60 FPS with SSAA off; at 3x, both the stutter and lag are significant. Disable SSAA and you walk around with complete fluidity, pinned at 60. Switch to 2x SSAA, though, and the swings between 60 and 30 FPS make each duplicated frame a painful jerk. This is one of those games I'd normally play with V-sync off, simply ignoring the tearing. I've been doing that for years; it's something I'm just used to.
But G-Sync makes all of that pain go away. You don't find yourself staring at the Fraps counter, looking for evidence of a dip below 60 FPS that'd compel you to turn off another detail setting. Rather, you can turn them up higher because, even if you do dip to 50 or even 40 FPS, you don't end up with any of that unnatural stutter. As for the other workaround, disabling V-sync, well, we'll get to that on the next page.