
G-Sync Technology Preview: Quite Literally A Game Changer
By Chris Angelini

Before we even got into our hands-on testing of Asus' prototype G-Sync-capable monitor, we were glad to see Nvidia tackling a very real issue affecting PC gaming, one for which no other solution had been proposed. Until now, your choices were V-sync on or V-sync off, each accompanied by compromises that detracted from the experience. If your policy is "I run with V-sync disabled unless I can't take the tearing in a particular game, at which point I flip it on," then you've been picking the lesser of two evils.

G-Sync sets out to address that by giving the monitor the ability to scan out at a variable refresh rate. Innovations like this are the only way our industry can disruptively move forward and maintain the technical preeminence of PCs over consoles and other gaming platforms. No doubt, Nvidia is going to take heat for not pursuing a standard that competing vendors could adopt. However, it's leveraging DisplayPort 1.2 for a solution we can go hands-on with today. As a result, two months after announcing G-Sync, here it is.
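To see why a variable refresh matters, consider a toy model of frame pacing. The sketch below is our own simplified illustration, not Nvidia's implementation: it ignores buffering details and simply compares when frames reach the screen on a fixed 60 Hz display with V-sync on versus a display that scans out each frame the moment it's ready.

```python
import math

def vsync_display_times(render_times_ms, refresh_hz=60):
    """V-sync on a fixed-refresh display: a finished frame waits
    for the next refresh boundary before it can appear on-screen."""
    interval = 1000.0 / refresh_hz
    shown, ready = [], 0.0
    for rt in render_times_ms:
        ready += rt  # moment the GPU finishes rendering this frame
        shown.append(math.ceil(ready / interval) * interval)
    return shown

def variable_refresh_display_times(render_times_ms):
    """Variable refresh: the display scans out as soon as a frame is ready."""
    shown, ready = [], 0.0
    for rt in render_times_ms:
        ready += rt
        shown.append(ready)
    return shown

# A steady 50 FPS workload (20 ms per frame) on a 60 Hz panel:
frames = [20.0] * 6
print(vsync_display_times(frames))
print(variable_refresh_display_times(frames))
```

With V-sync, the 20 ms frames get quantized onto 16.7 ms boundaries, so some on-screen intervals stretch to 33.3 ms while others shrink to 16.7 ms, which is the stutter you see. With variable refresh, every frame appears exactly 20 ms after the last, so a steady 50 FPS actually looks steady.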

The real question becomes: is G-Sync everything Nvidia promised it'd be?

It's always hard to see past hype when you have three talented developers extolling the merits of a technology you haven't seen in action yet. But if your first experience with G-Sync is Nvidia's pendulum demo, you're going to wonder whether such a dramatic difference is really possible, or whether the demo is a contrived scenario that's too good to be true.

Of course, as you shift over into real-world gaming, the impact is typically less binary. There are shades of "Whoa!" and "That's crazy" on one end of the spectrum and "I think I see the difference" on the other. The biggest impact comes when you switch from a 60 Hz display to something with a 144 Hz refresh rate and G-Sync enabled. But we also tested at 60 Hz with G-Sync to preview what you'll see from (hopefully) less expensive displays in the future. In certain cases, the shift from 60 to 144 Hz alone is what will hit you hardest, particularly if a high-end graphics subsystem can push those really high frame rates.

Today, we know that Asus plans to support G-Sync on its VG248QE, which the company told us will sell for $400 starting next year. That panel sports a native 1920x1080 resolution and refresh rates as high as 144 Hz. The non-G-Sync version won our Smart Buy award earlier this year for its exceptional performance. Personally, though, the 6-bit TN panel is an issue for me. I'm most looking forward to 2560x1440 and IPS. I'm even fine sticking with a 60 Hz refresh if it helps keep cost manageable.

While we expect a rash of announcements at CES, we have no official word from Nvidia as to when other displays sporting G-Sync modules will start shipping or how much they'll cost. We also aren't sure what the company's plans are for the previously discussed upgrade module, which should let you take an existing Asus VG248QE and make it G-Sync ready "in about 20 minutes."

What we can say, though, is that the wait will be worth it. You'll see its influence unmistakably in some games, and you'll notice it less in others. But the technology effectively settles that age-old question of whether to keep V-sync enabled or not.

Here's another interesting thought. Now that G-Sync is being tested, how long will AMD hold off on commenting? The company teased our readers in Tom's Hardware's AMA With AMD, In Its Entirety, mentioning it'd address the capability soon. Does it have something up its sleeve or not? Between the Mantle-enhanced version of Battlefield 4, Nvidia's upcoming Maxwell architecture, G-Sync, CrossFire powered by AMD's xDMA engine, and rumored upcoming dual-GPU boards, the end of 2013 and beginning of 2014 are bound to give us plenty of interesting news to talk about. Now, if we could just get more than 3 GB (Nvidia) and 4 GB (AMD) of GDDR5 on high-end cards that don't cost $1000...
