G-Sync Technology Preview: Quite Literally A Game Changer
You've long faced this dilemma: disable V-sync and live with image tearing, or turn V-sync on and tolerate the accompanying stutter and lag. Nvidia promises to make that choice obsolete with the variable refresh rate technology we're previewing today.
60 Hz Panels, SLI, Surround, And Availability
Going Faster: Is This Tech Limited To High-Refresh Panels?
You'll notice that the first G-Sync-enabled monitor already supports a very high refresh rate (well beyond the technology's sweet spot) and a 1920x1080 resolution. But Asus' display has its own limitations, like a six-bit TN panel. We wanted to know whether Nvidia plans to limit G-Sync to high-refresh-rate displays, or if we'd see it used on more common 60 Hz monitors. Also, the enthusiast in us wants access to 2560x1440 as soon as possible.
Nvidia reiterated for us that G-Sync is best experienced when your graphics card is pushing frame rates between 30 and 60 FPS. As a result, the technology can really benefit conventional 60 Hz screens retrofitted with the G-Sync module.
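To put numbers on that sweet spot: at frame rates between 30 and 60 FPS, frames take 16.7 to 33.3 ms to render, and on a fixed 60 Hz panel each one has to wait for the next 16.7 ms refresh boundary. Here's a minimal sketch (our own illustration, not Nvidia's implementation, and it assumes rendering is never stalled) contrasting when frames actually reach the screen:

```python
import math

REFRESH_INTERVAL_MS = 1000.0 / 60.0   # 16.7 ms between 60 Hz refreshes

def vsync_display_times(frame_times_ms):
    """Fixed refresh: each finished frame waits for the next 60 Hz tick."""
    done, shown = 0.0, []
    for ft in frame_times_ms:
        done += ft
        shown.append(math.ceil(done / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS)
    return shown

def variable_display_times(frame_times_ms):
    """Variable refresh: the panel scans out the moment a frame completes."""
    done, shown = 0.0, []
    for ft in frame_times_ms:
        done += ft
        shown.append(done)
    return shown

# Frame times wobbling between 30 and 60 FPS
frames = [20.0, 25.0, 18.0, 30.0, 22.0]
print(vsync_display_times(frames))     # ~[33.3, 50.0, 66.7, 100.0, 116.7]
print(variable_display_times(frames))  # [20.0, 45.0, 63.0, 93.0, 115.0]
```

With V-sync, the varied render times collapse into an erratic mix of 16.7 and 33.3 ms on-screen intervals; with a variable refresh, on-screen intervals simply track render times.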
So why start with 144 Hz? It sounds like a lot of the display vendors want to enable low motion blur functionality (3D LightBoost), which does require high refresh rates. But for those who are willing to leave that feature out (and why not, since it's not compatible with G-Sync right now anyway), it's possible to build a G-Sync-enabled panel for a lot less money.
As far as resolutions go, it sounds like QHD screens at refresh rates as high as 120 Hz will start showing up some time early next year.
Are There Any Issues Between SLI And G-Sync?
Nvidia's G-Sync FAQ clearly states that G-Sync is compatible with SLI; the graphics card attached to the display is the one that manages the variable refresh technology.
Now, the complete story requires a little more explanation. We've spent plenty of time discussing AMD and the frame pacing technology added to its Catalyst driver suite. Nvidia handles this through logic built into the Kepler architecture. The company says it'll be in Maxwell and beyond, of course, though we're pretty sure we heard about it even prior to Kepler. At any rate, the same pacing technology that keeps frames displayed consistently with V-sync off in SLI is exactly what G-Sync needs to function properly; no additional work is required. Those frames are presented by the "master" GPU, which also controls G-Sync.
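For the curious, the basic idea behind frame pacing is simple enough to sketch. What follows is our own simplified model built around an assumed fixed minimum presentation interval; Nvidia doesn't publish the actual heuristics baked into Kepler:

```python
def pace_frames(completion_times_ms, min_interval_ms):
    """Hold back frames that arrive in bursts (as AFR SLI tends to
    produce) so frame-to-frame presentation intervals stay even.
    The fixed min_interval_ms is our assumption, not Nvidia's logic."""
    paced, last = [], float("-inf")
    for t in completion_times_ms:
        present = max(t, last + min_interval_ms)  # delay bursty frames
        paced.append(present)
        last = present
    return paced

# Two GPUs in AFR finishing frames 5 ms apart, then a 28 ms gap
bursty = [5.0, 33.0, 38.0, 66.0, 71.0, 99.0]
print(pace_frames(bursty, min_interval_ms=16.0))
# [5.0, 33.0, 49.0, 66.0, 82.0, 99.0]: intervals of 28/16/17/16/17 ms
# instead of the raw 28/5/28/5/28 ms stutter pattern
```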
What Does It Take To See G-Sync In Surround?
Now, obviously, the idea of slinging multiple cards together to output to a 1080p screen doesn't sound very necessary; even a mid-range Kepler-based card should manage the frame rates needed to make that resolution playable. It's also not currently possible to run a two-card SLI configuration with three G-Sync-capable displays in Surround.
This is a limitation of the current display outputs on Nvidia's cards, which typically include two DVI ports, HDMI, and a DisplayPort connector. G-Sync requires DisplayPort 1.2, and an adapter won't work (neither will an MST hub). The only way to make Surround happen is with three cards, each attached to its own monitor. Of course, we presume that there's nothing stopping Nvidia's partners from coming out with a "G-Sync Edition" card sporting more DP connectivity.
G-Sync And Triple Buffering
Would you need triple buffering enabled to get smooth performance out of G-Sync, similar to what you'd do with V-sync on? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline is never stalled, but using triple buffering with G-Sync is actually detrimental: it adds an extra frame of latency with no performance benefit. Unfortunately, triple buffering is often set by the game itself, so there's no way to manually override it.
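The latency cost is easy to put numbers on. The back-of-the-envelope model below is ours, not Nvidia's, but the principle follows from the buffer chain: since G-Sync can refresh the moment a frame completes, any frame queued ahead of the newest one contributes pure waiting time.

```python
def added_latency_ms(frame_time_ms, frames_queued_ahead):
    # On a variable-refresh display, each frame queued ahead in the
    # buffer chain delays the newest frame by roughly one frame time.
    return frame_time_ms * frames_queued_ahead

for fps in (30, 45, 60):
    frame_time = 1000.0 / fps
    # Triple buffering queues one extra frame versus double buffering
    print(f"{fps} FPS: ~{added_latency_ms(frame_time, 1):.1f} ms of extra lag")
# 30 FPS: ~33.3 ms of extra lag
# 45 FPS: ~22.2 ms of extra lag
# 60 FPS: ~16.7 ms of extra lag
```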
What About Games That Typically Don't Respond Well To V-sync-Off?
Games like Skyrim, part of our usual benchmark suite, are intended to be run with V-sync enabled on a 60 Hz panel (although this drives some of us nuts due to the impact of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with titles based on the Gamebryo and Creation engines, which are sensitive to V-sync settings? Does it cap out at 60 FPS?
That's a characteristic of the game, and G-Sync doesn't change it, just like running on a 120 or 144 Hz display with V-sync enabled wouldn't. Nvidia says that games like Skyrim should work fine with its technology, though, so long as they're limited to the frame rates the engine expects. In those cases, set your refresh to 60 Hz, turn on G-Sync, and the feature will conform to the correct maximum frame rate.
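For reference, the .ini edits alluded to above are of this form (one of our commenters below brings up the same setting); treat any manual edit like this as at-your-own-risk:

```ini
; Skyrim.ini, [Display] section: iPresentInterval controls the
; Creation engine's built-in V-sync (1 = on, the default).
; Setting it to 0 forces V-sync off for testing, at the cost of
; the engine timing quirks the game is known for.
[Display]
iPresentInterval=0
```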
When Will This Stuff Be Available, And For How Much?
Currently, Nvidia expects its OEM partners to start shipping G-Sync-enabled displays in the first quarter of next year. The company says cost will be less of an issue than perhaps many enthusiasts expect, since the G-Sync module replaces the monitor's scaler. The pricing delta between those two components is the difference you'll see.
Hope You Have A Fast Mouse
As a final note, Nvidia makes a special point of mentioning that you're best off with a fast mouse if you shift over to a G-Sync-based setup. A polling rate of 1000 Hz helps ensure your input device doesn't negatively affect reaction times.
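The reasoning is straightforward: polling rate caps how stale a mouse report can be before the game even sees it, and at low rates that staleness is a meaningful slice of a frame. A quick worked example:

```python
# A mouse's polling rate sets the worst-case age of an input sample
# before the game ever sees it.
for polling_hz in (125, 500, 1000):
    worst_case_ms = 1000.0 / polling_hz
    print(f"{polling_hz} Hz polling: input up to {worst_case_ms:.0f} ms stale")
# 125 Hz: up to 8 ms stale (half of a 60 FPS frame time)
# 500 Hz: up to 2 ms stale
# 1000 Hz: up to 1 ms stale
```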
gamerk316: I consider G-Sync to be the most important gaming innovation since DX7. It's going to be one of those "How the HELL did we live without this before?" technologies.
monsta: Totally agree, G-Sync is really impressive and the technology we have been waiting for.
What the hell is Mantle?
wurkfur: I personally have a setup that handles 60+ FPS in most games and just leave V-sync on. For me, 60 FPS is perfectly acceptable, and even when I went to my friend's house, where he had a 120 Hz monitor with SLI, I could hardly see much difference.
I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.
At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
expl0itfinder: Competition, competition. Anybody who is flaming over who is better, AMD or Nvidia, is clearly missing the point. With Nvidia's G-Sync and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices, better products.
This needs to be less proprietary for it to become a game changer. As it is, requiring a specific GPU and a specific monitor at an additional price premium just isn't compelling and won't reach a wide demographic.
Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors create similar solutions, I feel like this is doomed to be as niche as LightBoost, PhysX, and, I suspect, Mantle.
ubercake: I'm on page 4, and I can't even contain myself.
Tearing and input lag at 60 Hz on a 2560x1440 or 2560x1600 monitor have been the only reasons I won't game on one. G-Sync will get me there.
This is awesome, outside-of-the-box thinking tech.
I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions! Why not get a licensing payment for all monitors sold with this tech? Or all video cards implementing this tech? It just makes sense.
rickard: Could the Skyrim stuttering at 60 Hz with G-Sync be because the engine operates internally at 64 Hz? All of those Bethesda-tech games drop four frames every second when V-sync'd to 60 Hz, which causes that severe microstutter you see on nearby floors and walls when moving and strafing. The same thing happened in Oblivion, Fallout 3, and New Vegas on PC. You had to use stutter removal mods in conjunction with the script extenders to actually force the game to operate at 60 Hz and smooth it out with V-sync on.
You mention it being smooth when set to 144 Hz with G-Sync; is there any way you could cap the display at 64 Hz, try it with G-Sync alone (iPresentInterval=0), and see what happens then? I'm just wondering if the game is at fault here, and whether that specific issue is still there in the latest version of the engine.
Alternatively, I suppose you could load up Fallout 3 or New Vegas instead and see if the G-Sync results match Skyrim.
Old_Fogie_Late_Bloomer: I would be excited for this if it weren't for Oculus Rift. I don't mean to be dismissive; this looks awesome...but it isn't Oculus Rift.
hysteria357: Am I the only one who has never experienced screen tearing? Most of my games run past my refresh rate, too....