A Brief History of Fixed Refresh Rates
A long time ago, PC monitors were big heavy items that contained curiously-named components like cathode ray tubes and electron guns. Back then, the electron guns shot at the screen to illuminate the colorful dots we call pixels. They did this one pixel at a time in a left-to-right scanning pattern for each line, working from the top to the bottom of the screen. Varying the electron guns' speed from one complete refresh to the next wasn't very practical, and there was no real need since 3D games were still decades away. So, CRTs and the associated analog video standards were designed with fixed refresh rates in mind.
LCDs eventually replaced CRTs, and digital connections (DVI, HDMI, and DisplayPort) replaced the analog ones (VGA). But the bodies responsible for video signal standards (with VESA chief among them) never moved away from fixed refresh rates. Movies and television, after all, still rely on an input signal with a constant frame rate. Once again, a variable refresh didn't seem all that important.
Variable Frame Rates and Fixed Refresh Rates Don’t Match

Until the advent of advanced 3D graphics, a fixed refresh rate was never a problem. But one surfaced once powerful graphics processors arrived: the rate at which a GPU renders individual frames (its frame rate, commonly expressed in FPS, or frames per second) isn't constant. Rather, it varies over time. A given card might manage 30 frames per second in a particularly taxing scene, then 60 FPS moments later when you look up into an empty sky.
V-sync off makes you vulnerable to severe tearing on-screen
As it turns out, a variable frame rate from the graphics card and a fixed refresh rate on an LCD don't work particularly well together. In that configuration, you end up with an on-screen artifact called tearing, which appears when two or more partial frames are displayed together during a single monitor refresh cycle. The partial frames are typically misaligned, yielding a very distracting effect whenever there's motion on-screen.

The image above shows two well-known artifacts that are commonly seen but difficult to document. Because they are display artifacts, they don't show up in regular in-game screenshots, even though they're exactly what you experience while playing. Capturing them requires a fast camera or, if you have access to a capture card (which is what we use for our FCAT-based benchmarking), an uncompressed video stream recorded from the DVI port, where the transition from one frame to another is plainly visible. At the end of the day, though, the best way to see these effects is with your own eyes.
You can see the tearing effect in both images above, taken with a camera, and in the one below, captured through a capture card: the picture is cut horizontally and appears misaligned. In the first shot, a 60 Hz Sharp screen is on the left and a 120 Hz Asus display on the right. Tearing at 120 Hz is naturally less pronounced, since the refresh rate is twice as high, but it's still noticeable. This type of visual artifact is the clearest indicator that the pictures were taken with V-sync disabled.
Battlefield 4 on a GeForce GTX 770 with V-sync disabled
The other issue in the BioShock: Infinite comparison shot is called ghosting, and it's particularly apparent toward the bottom of the left-hand image. It's caused by slow pixel response: individual pixels don't change color quickly enough, leaving an afterglow behind anything in motion. The in-game effect is far more dramatic than a still image can convey. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will appear blurry whenever there's fast movement on-screen, which is why displays like it typically aren't recommended for first-person shooters.
V-sync: Trading One Problem For Another
Vertical synchronization, or V-sync, is a very old solution to the tearing problem. Enabling V-sync essentially tells the video card to hold each finished frame until the display starts a new refresh cycle, eliminating tearing entirely. The downside is that, if your video card cannot keep up and the frame rate dips below 60 FPS (on a 60 Hz display), effective FPS bounces back and forth among integer divisors of the screen's refresh rate (so, 60, 30, 20, 15 FPS, and so on), which in turn causes perceived stuttering.
When frame rate drops below refresh, you encounter stuttering with V-sync on
Furthermore, because it forces the video card to wait, and sometimes relies on a third back buffer, V-sync can introduce additional input lag into the chain. V-sync is therefore both a blessing and a curse, trading one set of compromises for another. An informal survey around the office suggests that most gamers keep V-sync off as a general rule, turning it on only when tearing becomes unbearable.
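To make the divisor math above concrete: with V-sync on, a finished frame can only be shown at the next refresh boundary, so any frame that takes longer than one refresh interval waits for the interval after that. Here's a minimal sketch of that quantization, using hypothetical render rates (a simplified model that ignores driver buffering):

```python
# Simplified model of V-sync frame-rate quantization on a 60 Hz display.
# A frame that misses a refresh boundary waits for the next one, so the
# effective rate snaps to refresh / ceil(render_time / refresh_interval).
import math

REFRESH_HZ = 60.0
refresh_interval = 1.0 / REFRESH_HZ  # ~16.7 ms

for gpu_fps in (75, 60, 55, 45, 35, 25):  # hypothetical render rates
    render_time = 1.0 / gpu_fps
    scans_waited = math.ceil(render_time / refresh_interval)
    effective_fps = REFRESH_HZ / scans_waited
    print(f"GPU renders at {gpu_fps:>2} FPS -> displayed at {effective_fps:.0f} FPS")

# Anything between 30 and 60 FPS collapses to 30, anything between 20 and 30
# collapses to 20, and so on; the jumps between those steps read as stutter.
```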
Getting Creative: Nvidia Introduces G-Sync
With the launch of its GeForce GTX 680, Nvidia enabled a driver mode called Adaptive V-sync, which attempted to mitigate these issues by enabling V-sync at frame rates above the monitor's refresh rate and quickly switching it off whenever instantaneous performance dropped below the refresh. Although the technology did its job well, it was really a workaround, and it did not prevent tearing when the frame rate fell below the display's refresh.
The introduction of G-Sync is much more interesting. Nvidia is basically showing that, instead of forcing video cards to display games on monitors with a fixed refresh, we can make the latest screens work at variable rates.
The GPU's frame rate determines the monitor's refresh, eliminating the artifacts of V-sync on or off
DisplayPort’s packet-based data transfer mechanism provided a window of opportunity. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that understands variable blanking, an LCD can be driven at a variable refresh rate aligned with whatever frame rate the video card is putting out (up to the screen's maximum refresh rate, of course). In practice, Nvidia is leveraging specific capabilities of DisplayPort creatively, killing two birds with one stone.
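As a rough mental model (our own simplification for illustration, not a description of Nvidia's actual implementation), you can think of the module as simply waiting in the blanking interval until the next frame shows up, then scanning it out, limited only by the panel's maximum refresh rate:

```python
# Toy model of a variable-refresh panel: each scan-out is triggered by the
# arrival of a new frame rather than by a fixed clock. This is our own
# simplification for illustration, not Nvidia's implementation.
MAX_HZ = 144.0
MIN_INTERVAL = 1.0 / MAX_HZ  # the panel can't scan out faster than this

def scan_out_times(frame_arrivals):
    """Map GPU frame-completion times (seconds) to panel scan-out times."""
    last_scan = float("-inf")
    for arrival in frame_arrivals:
        last_scan = max(arrival, last_scan + MIN_INTERVAL)
        yield last_scan

# Hypothetical, uneven GPU frame times: each refresh lands on a frame
# boundary, so no frame is torn and none is held for an extra full scan.
for t in scan_out_times([0.011, 0.025, 0.047, 0.052, 0.071]):
    print(f"refresh at {t * 1000:5.1f} ms")
```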
Even before we jump into the hands-on testing, we have to commend the creative approach to solving a very real problem affecting gaming on the PC. This is innovation at its finest. But how well does G-Sync work in practice?
Nvidia sent over an engineering sample of Asus' VG248QE with its scaler replaced by a G-Sync module. We're already plenty familiar with this specific display; we reviewed it in Asus VG248QE: A 24-Inch, 144 Hz Gaming Monitor Under $300 and it earned a prestigious Tom's Hardware Smart Buy award. Now it's time to preview how Nvidia's newest technology affects our favorite games.
As we were going through Nvidia's press material, we found ourselves asking a number of questions about G-Sync as a technology today, along with its role in the future. During a recent trip to the company's headquarters in Santa Clara, we were able to get some answers.
G-Sync And 3D LightBoost
The first thing we noticed was that Nvidia was sending out that Asus VG248QE monitor, modified to support G-Sync. That monitor also supports what Nvidia currently calls 3D LightBoost technology, which was originally introduced to improve brightness in 3D displays, but has long been unofficially used in 2D mode as well, using its panel-pulsing backlight to reduce the ghosting (or motion blur) artifact we mentioned on page one. Naturally, we wanted to know if that could be used with G-Sync.
Nvidia's answer was no. Although using both technologies at the same time would be ideal, strobing the backlight at a variable refresh rate currently results in flicker and brightness problems. Solving them is incredibly complex, since luminance has to be adjusted and the pulses tracked. As a result, you currently have to choose between the two technologies, although the company is working on a way to use them together in the future.
The G-Sync Module's On-Board Memory
As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there's no longer a need to wait for the panel to scan. However, we noticed that the G-Sync module has on-board memory. Could the module be buffering frames itself? If so, how much time would it take for a frame to make its way through the new pipeline?
According to Nvidia, frames are never buffered in the module's memory. Data is displayed on-screen as it arrives, and the memory serves several other roles. In any case, the processing time G-Sync adds is well under one millisecond; in fact, it's roughly the same latency you encounter with V-sync off, attributable to the game, graphics driver, mouse, and so on.
Will G-Sync Ever Be Standardized?
This came up during a recent AMA with AMD when a reader wanted to get the company's reaction to G-Sync. However, we also wanted to follow up with Nvidia directly to see if the company had any plans to push its technology as an industry standard. In theory, it could propose G-Sync as an update to the DisplayPort standard, exposing variable refresh rates. Nvidia is a member of VESA, the industry's main standard-setting board, after all.
Simply put, there are no plans to introduce a new spec to DisplayPort, HDMI, or DVI. G-Sync already works with DisplayPort 1.2, meaning there's no need for a standards change.
As mentioned, Nvidia is working to make G-Sync compatible with what it currently refers to as 3D LightBoost (and will be called something else soon). It's also trying to bring the module's cost down to make G-Sync more accessible.
G-Sync At Ultra HD Resolutions
Nvidia's online FAQ promises G-Sync-capable monitors with resolutions as high as 3840x2160. However, the Asus model we're previewing today maxes out at 1920x1080. Currently, the only Ultra HD monitors employ STMicro's Athena controller, which uses two scalers to create a tiled display. We were curious, then: does the G-Sync module support an MST configuration?
In truth, we'll be waiting a while before 4K displays show up with variable refresh rates. Today, there is no single scaler able to support 4K resolutions, and the soonest that's expected to arrive is Q1 of 2014, with monitors including it showing up in Q2. Because the G-Sync module replaces the scaler, compatible panels would only start surfacing sometime after that point. Fortunately, the module does natively support Ultra HD.
Keeping Up: What Happens Under 30 Hz?
The variable refresh enabled by G-Sync works down to 30 Hz. The reason is that, at very low refresh rates, the image on an LCD starts to decay after a while, producing visual artifacts. If your source drops below 30 FPS, the module refreshes the panel automatically to avoid those issues. That can mean displaying the same frame more than once, but the 30 Hz floor keeps the picture looking its best.
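Here's a sketch of that repeat behavior, under our own simplified assumptions (not Nvidia's firmware): if no new frame arrives within roughly 1/30 of a second, the previous frame is scanned out again so the panel never sits idle long enough to decay.

```python
# Simplified sketch of the 30 Hz floor: repeat the previous frame whenever
# the GPU takes longer than ~33 ms to deliver a new one. Illustrative only.
MIN_HZ = 30.0
MAX_HOLD = 1.0 / MIN_HZ  # ~33.3 ms

def panel_refreshes(frame_arrivals):
    """Yield (time, description) for every panel scan-out."""
    last_scan = 0.0
    for arrival in frame_arrivals:
        # Re-scan the previous frame as often as needed while waiting.
        while arrival - last_scan > MAX_HOLD:
            last_scan += MAX_HOLD
            yield last_scan, "repeat previous frame"
        last_scan = arrival
        yield last_scan, "new frame"

# Hypothetical 20 FPS source: every new frame is preceded by one repeat.
for t, what in panel_refreshes([0.05, 0.10, 0.15]):
    print(f"{t * 1000:6.1f} ms: {what}")
```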
Going Faster: Is This Tech Limited To High-Refresh Panels?
You'll notice that the first G-Sync-enabled monitor already supports a very high refresh rate (well beyond the technology's sweet spot) and a 1920x1080 resolution. But Asus' display has its own limitations, like a six-bit TN panel. We wanted to know whether Nvidia plans to limit G-Sync to high-refresh rate displays, or if we'd see it used on more common 60 Hz monitors. Also, the enthusiast in us wants access to 2560x1440 as soon as possible.
Nvidia reiterated that G-Sync is best experienced when your graphics card is pushing frame rates between 30 and 60 FPS. As a result, the technology can really benefit conventional 60 Hz screens retrofitted with the G-Sync module.
So why start with 144 Hz? It sounds like a lot of the display vendors want to enable low motion blur functionality (3D LightBoost), which does require high refresh rates. But for those who're willing to leave that feature out (and why not, since it's not compatible with G-Sync right now anyway), it's possible to build a G-Sync-enabled panel for a lot less money.
As far as resolutions go, it sounds like QHD screens at refresh rates as high as 120 Hz will start showing up some time early next year.
Are There Any Issues Between SLI And G-Sync?
Nvidia's G-Sync FAQ clearly states that G-Sync is compatible with SLI; the graphics card attached to the display is the one that manages the variable refresh technology.
Now, the complete story requires a little more explanation. We've spent plenty of time discussing AMD and its frame pacing technology added to the Catalyst driver suite. Nvidia handles this through logic built into the Kepler architecture. The company says it'll be in Maxwell and beyond of course, but we're pretty sure we heard about it even prior to Kepler. At any rate, the same pacing technology that keeps frames displayed consistently with V-sync off in SLI is what you need for G-Sync to function properly. There's no additional work that needs to be done. Those frames are displayed from the "master" GPU, which also controls G-Sync.
What Does It Take To See G-Sync In Surround?
Now, obviously, the idea of slinging multiple cards together to drive a 1080p screen doesn't sound very necessary; even a mid-range Kepler-based card should manage playable frame rates at that resolution. However, it's not currently possible to run a two-card SLI configuration with three G-Sync-capable displays in Surround, either.
This is a limitation of the current display outputs on Nvidia's cards, which typically include two DVI ports, HDMI, and a DisplayPort connector. G-Sync requires DisplayPort 1.2, and an adapter won't work (neither will an MST hub). The only way to make Surround happen is with three cards, each attached to its own monitor. Of course, we presume that there's nothing stopping Nvidia's partners from coming out with a "G-Sync Edition" card sporting more DP connectivity.
G-Sync And Triple Buffering
Would you need triple buffering enabled to get smooth performance out of G-Sync, as you would with V-sync on? The answer is no. Not only does G-Sync not require triple buffering, since its pipeline is never stalled, but using it is actually detrimental: it adds an extra frame of latency with no performance benefit. Unfortunately, triple buffering is often set by the game itself, leaving no way to manually override it.
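The latency cost is easy to reason about: every frame queued ahead of the one being scanned out adds roughly one frame time before your input reaches the screen. A back-of-the-envelope sketch with hypothetical numbers:

```python
# Rough latency cost of queuing frames ahead of display (hypothetical numbers;
# real pipelines add driver and engine queuing on top of this).
for fps in (40, 60):
    frame_time_ms = 1000.0 / fps
    for extra_queued in (0, 1):  # 0 = no extra queue, 1 = one extra buffered frame
        added = extra_queued * frame_time_ms
        print(f"{fps} FPS, {extra_queued} extra queued frame(s): "
              f"about +{added:.1f} ms before input shows up on-screen")
```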
What About Games That Typically Don't Respond Well To V-sync-Off?
Games like Skyrim, part of our usual benchmark suite, are intended to be run with V-sync enabled on a 60 Hz panel (although this drives some of us nuts due to the impact of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with titles based on the Gamebryo and Creation engines, which are sensitive to V-sync settings? Does it cap out at 60 FPS?
That's a characteristic of the game, and G-Sync doesn't change it, just like running on a 120 or 144 Hz display with V-sync enabled wouldn't. Nvidia says that games like Skyrim should work fine with its technology, though, so long as they're limited to the frame rates the engine expects. In those cases, set your refresh to 60 Hz, turn on G-Sync, and the feature will conform to the correct maximum frame rate.
When Will This Stuff Be Available, And For How Much?
Currently, Nvidia expects its OEM partners to start shipping G-Sync-enabled displays in the first quarter of next year. The company says cost will be less of an issue than perhaps many enthusiasts expect, since the G-Sync module replaces the monitor's scaler. The pricing delta between those two components is the difference you'll see.
Hope You Have A Fast Mouse
As a final note, Nvidia makes it a special point to mention that you're best off with a fast mouse should you shift over to a G-Sync-based setup. A polling rate of 1000 Hz will help ensure your input device doesn't negatively affect reaction times.
System requirements
G-Sync isn't part of any existing standard, nor does Nvidia anticipate trying to get it included with future versions of DisplayPort. As such, there are some specific requirements that need to be satisfied before you can expect that G-Sync-capable monitor you have your eye on to work properly.
First, you need an Nvidia graphics card. Specifically, it needs to be a GeForce GTX 650 Ti Boost or faster model. Kepler is the first graphics architecture with an integrated display controller that can be programmed to enable G-Sync, so even if you have a Fermi-based GPU that's faster, the technology won't work. Maxwell was designed specifically to support it, so upcoming cards will feature G-Sync as well.
The second requirement is a monitor with Nvidia's G-Sync module built in. The module replaces the screen's scaler, so it's not possible to add G-Sync to a tiled Ultra HD display, for example. In today's story, we're using a prototype capable of 1920x1080 at up to 144 Hz. But you can imagine just how much more impact G-Sync will have if manufacturers start adding it to less expensive 60 Hz panels.
Third, you need to be using a DisplayPort 1.2 cable. DVI and HDMI connections are not supported. In the near-term, this means that the only way G-Sync is going to work across multi-display Surround arrays is via a three-way SLI configuration, since each card has at most a single DisplayPort connection and adapting from a card's DVI output to DisplayPort won't work. Similarly, an MST hub won't do the trick.
Finally, driver support is required. The latest 331.93 beta software enables G-Sync, and we assume future WHQL-certified releases will include it as well.
Test Setup
| Test Hardware | |
|---|---|
| Processors | Intel Core i7-3970X (Sandy Bridge-E) 3.5 GHz Base Clock Rate, Overclocked to 4.3 GHz, LGA 2011, 15 MB Shared L3, Hyper-Threading enabled, Power-savings enabled |
| Motherboard | MSI X79A-GD45 Plus (LGA 2011) X79 Express Chipset, BIOS 17.5 |
| Memory | G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V |
| Hard Drive | Samsung 840 Pro SSD 256 GB SATA 6Gb/s |
| Graphics | Nvidia GeForce GTX 780 Ti 3 GB |
| | Nvidia GeForce GTX 760 2 GB |
| Power Supply | Corsair AX860i 860 W |
| System Software And Drivers | |
| Operating System | Windows 8 Professional 64-bit |
| DirectX | DirectX 11 |
| Graphics Driver | Nvidia GeForce 331.93 Beta |
Now, it's important to understand where G-Sync does and does not yield the most significant impact. There's a good chance you're currently using a screen that operates at 60 Hz. Faster 120 and 144 Hz refresh rates are popular amongst gamers, but Nvidia is (rightly) predicting that its biggest market will be the enthusiasts still stuck at 60 Hz.
With V-sync turned on at 60 Hz, the most visually-disturbing artifacts are encountered when 60 FPS cannot be maintained, yielding those jarring jumps between 30 and 60 FPS. That's where you see significant stuttering. With V-sync turned off, scenes with a lot of motion or panning side to side make tearing most apparent. For some enthusiasts, this detracts so much from the game that they simply turn V-sync on and live with the stuttering and incurred input lag.
As you step up to 120 and 144 Hz and higher frame rates, the display refreshes itself more often, cutting down on the time a single frame persists across multiple scans when performance can't keep up. However, the same issues with V-sync on and off do persist. For this reason, we'll be testing the Asus monitor hands-on in 60 Hz and 144 Hz modes, with and without G-Sync enabled.
So now it's time to put G-Sync to the test. Bust out the video capture card, multi-SSD array, and get to benchmarking, right?
Wrong.
This isn't a performance story. It's a quality one. In this case, benchmarking only tells us one thing: where the frame rate is at any given point. What it doesn't tell us is how good the experience with or without G-Sync really is. And so we have to rely on carefully-written and eloquently delivered words. I'll try to make it as painless as possible.
Why not simply record video and let you watch for yourself? Because a camera records at a fixed rate (typically 60 Hz), and your monitor plays that footage back at a constant 60 Hz, too. Since G-Sync's refresh is variable, you wouldn't see the technology in action.
Given enough games, there is a seemingly endless number of permutations we could run. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz...the list goes on and on. But we'll start by setting our prototype screen to a 60 Hz refresh rate and gaming with V-sync enabled.
Perhaps the easiest place to start is Nvidia's own demo tool, a pendulum that swings back and forth. It can be set to a simulated 60, 50, or 40 FPS, or it can vacillate between 40 and 60. At any of those settings, you toggle between no V-sync, V-sync enabled, and G-Sync. Contrived though this test may be, it's probably the most dramatic example of the technology possible. You watch the scene at 50 FPS with V-sync on and think, "Yeah, that's not too bad; I see what appears to be a stutter, but I could live with that." Then G-Sync is switched on and you slap yourself. "What was I thinking? That's a night-and-day difference. How could I have been alright with that before?"
But then you pinch yourself and remember that this is a tech demo. You want your evidence steeped in real-world gaming. So you fire up something you know is going to be taxing, like Arma III.

In Arma III, I can drop a GeForce GTX 770 into my test machine and dial in the Ultra detail preset. With V-sync off, that's good for frame rates in the 40s or 50s. Turn V-sync on, though, and you're forced down to 30 FPS. Performance isn't high enough to bounce between 30 and 60 FPS; instead, the card's frame rate is simply neutered.
Because there wasn't any real stuttering before, what you see on-screen with G-Sync enabled isn't dramatically different, except that practical performance jumps 10 to 20 FPS higher. Input lag should decrease as well, since the same frames are no longer displayed across multiple monitor scans. I find Arma to be less twitchy than a lot of other games, though, so I didn't feel much latency.
Metro: Last Light, on the other hand, makes G-Sync more apparent. Running on a GeForce GTX 770, the game can be set to 1920x1080 at Very High details with 16x AF, normal tessellation, and normal motion blur. From there, you can tweak the SSAA setting from 1x to 2x to 3x to erode away the frame rate incrementally.
It also helps that where the game starts you off includes a hallway that's easy to strafe up and down. I fired the level up with V-sync on at 60 Hz and went to town. Fraps reported frame rates down below 30 FPS with SSAA at 3x, and up closer to 60 FPS with SSAA off. Both the stutter and lag are significant. Disable SSAA and you walk around with complete fluidity pinned at 60. Switch to 2x SSAA, though, and the variation from 60 to 30 FPS makes each duplicated frame a painful jerk. This is one of those games I'd set to V-sync off, and simply ignore the tearing. I've been doing that for years; it's something I'm just used to.
But G-Sync makes all of that pain go away. You don't find yourself staring at the Fraps counter, looking for evidence of a dip below 60 FPS that'd compel you to turn off another detail setting. Rather, you can turn them up higher because, even if you do dip to 50 or even 40 FPS, you don't end up with any of that unnatural stutter. And the previous solution, disabling V-sync? Well, we'll get to that on the next page.
I'm basing this on an unofficial poll of Tom's Hardware writers and friends easily accessible to me over Skype (in other words, the sample size is small), but most everyone who understands what V-sync is and what it compromises appears to turn it off. The only time they go back is when running with V-sync disabled is deemed unbearable due to the tearing you experience when frames coming from your GPU don't match the panel's refresh cycle.
As you might imagine, then, the visual impact of running with V-sync disabled is unmistakable, though it's also largely affected by the game you're playing and the detail settings you use.

Take Crysis 3, for example. It's easy to really hammer your graphics subsystem using the taxing Very High preset. And because Crysis is a first-person shooter involving plenty of fast motion, the tears you see can be quite substantial. In the example above, output from FCAT is captured between two frames, and you see branches of the tree completely disjointed.

On the other hand, when we force V-sync off in Skyrim, the tearing isn't nearly as bad. Consider that our frame rate is insanely high, and that multiple frames are showing up on-screen per display scan. Thus, the amount of motion per frame is relatively low. There are still issues with playing Skyrim like this, so it's probably not the optimal configuration. But it just goes to show that even running with V-sync turned off yields a varying experience.
Here's a third example, from Tomb Raider, where Lara's shoulder is pretty severely misaligned (look at her hair and tank top strap, too). Incidentally, Tomb Raider is one of the only games in our suite that lets you choose between double- and triple-buffering when V-sync is enabled.

A final chart shows that running Metro: Last Light with G-Sync enabled at 144 Hz basically gives you the same performance as running the game with V-sync turned off. The part you can't see is that there is no tearing. Using the technology on a 60 Hz screen caps you out at 60 FPS, though there is no stuttering or input lag.
At any rate, for those of you (and us) who've spent countless hours watching the same benchmark sequences over and over, this is what we're used to. This is how we measure the absolute performance of graphics cards. So it can be a little jarring to watch the same passages with G-Sync turned on, yielding the fluidity of V-sync enabled without the tearing that accompanies V-sync turned off. Again, I wish it were something I could show you with a video clip, but I'm working on a way to host another event in Bakersfield to allow readers to try G-Sync out for themselves, blindly, to gather more dynamic reactions.
Going Hands-On With More Games
I tried testing several other titles: Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite, and Battlefield 4 all received at least some time on the bench, and all of them except Skyrim benefited from G-Sync. The impact varies by title, but once you see it, you can't un-see the artifacts you were subconsciously disregarding before.
There can be artifacts. For example, the crawling attributed to aliasing is more distracting when motion is smooth. So you end up really wanting as much AA as you can get to keep your eyes from being drawn to jaggies that weren't as bothersome before.
Skyrim: A Special Case
As for Skyrim, its Creation engine is designed to run with V-sync enabled by default; benchmarking it above 60 FPS requires adding a special iPresentInterval=0 line to one of the game's .ini files.
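For reference, the tweak usually looks like the snippet below; the exact file varies between installs, though it's typically placed under the [Display] section of Skyrim.ini:

```ini
; Commonly added under [Display] in Skyrim.ini (the exact file can vary by setup)
; to decouple the Creation engine from V-sync for benchmarking.
[Display]
iPresentInterval=0
```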
So, there are three ways to test Skyrim: in its default state, with Nvidia's driver left at "Use the 3D application setting"; with G-Sync forced on in the driver and Skyrim left alone; and with G-Sync forced on and V-sync disabled through Skyrim's .ini.

With the prototype monitor set to 60 Hz, the first configuration predictably yielded a flat 60 FPS at Ultra settings using a GeForce GTX 770. Consequently, motion is nice and smooth. However, user input is still hampered by an obnoxious amount of lag. Moreover, strafing from side to side reveals lots of motion blur. This is the way almost everyone plays the game on PCs, though. You can step the screen up to 144 Hz of course, and that really cleans up the motion blur. But because the GTX 770 sits between 90 and 100 FPS, you end up with palpable stuttering as the engine jumps between 144 and 72 FPS.
At 60 Hz, adding G-Sync to the equation actually has a detrimental effect, likely because V-sync is forced on and the technology is meant to operate with V-sync off. Now, strafing (particularly up close to walls) leads to fairly severe stuttering. That's going to be a problem on 60 Hz G-Sync-capable panels, at least in games like Skyrim. Fortunately, as it pertains to Asus' VG248QE, you can switch to 144 Hz and, despite V-sync still being on, G-Sync appears to function at those high frame rates without stutter.
Completely shutting off V-sync makes mouse control so much snappier in Skyrim. However, you do end up with a bunch of tearing (not to mention other artifacts like shimmering water). Turning on G-Sync still leaves you with stutters at 60 Hz, which smooth out at 144 Hz. Although we do all of our testing with V-sync turned off for our graphics card reviews, I wouldn't recommend playing this game without it.
For Skyrim, turning G-Sync off and playing at 60 Hz is probably the most natural approach, providing you get more than 60 FPS all of the time using your desired quality settings (not difficult).
Before we even got into our hands-on testing of Asus' prototype G-Sync-capable monitor, we were glad to see Nvidia approaching a very real issue affecting PC gaming, for which no other solution had been proposed. Until now, your choices were V-sync on or V-sync off, each decision accompanied by compromises that detracted from the experience. When your line is "I run with V-sync disabled unless I can't take the tearing in a particular game, at which point I flip it on," then the decision sounds like picking the lesser of two evils.
G-Sync sets out to address that by giving the monitor an ability to scan at a variable refresh. Such innovations are the only way our industry can disruptively move forward and support the technical preeminence of PCs over consoles and other gaming platforms. No doubt, Nvidia is going to take heat for not pursuing a standard that competing vendors might be able to adopt. However, it's leveraging DisplayPort 1.2 for a solution we can go hands-on with today. As a result, two months after announcing G-Sync, here it is.
The real question becomes: is G-Sync everything Nvidia promised it'd be?
It's always hard to break past the hype when you have three talented developers extolling the merits of a technology you haven't seen in action yet. And if your first experience with G-Sync is Nvidia's pendulum demo, you're going to wonder whether such a severe and extreme difference is really possible, or whether the demo represents a special case that's too good to be true.
Of course, as you shift over into real-world gaming, the impact is typically less binary. There are shades of "Whoa!" and "That's crazy" on one end of the spectrum and "I think I see the difference" on the other. The biggest splash comes from switching from a 60 Hz display to something with a 144 Hz refresh and G-Sync enabled. But we also tried testing at 60 Hz with G-Sync to preview what you'll see from (hopefully) less expensive displays in the future. In certain cases, the jump from 60 to 144 Hz alone is what will strike you as most effective, particularly if your high-end graphics subsystem can push those really high frame rates.
Today, we know that Asus plans to support G-Sync on its VG248QE, which the company told us will sell for $400 starting next year. That panel sports a native 1920x1080 resolution and refresh rates as high as 144 Hz. The non-G-Sync version won our Smart Buy award earlier this year for its exceptional performance. Personally, though, the 6-bit TN panel is an issue for me. I'm most looking forward to 2560x1440 and IPS. I'm even fine sticking with a 60 Hz refresh if it helps keep cost manageable.
While we expect a rash of announcements at CES, we have no official word from Nvidia as to when other displays sporting G-Sync modules will start shipping or how much they'll cost. We also aren't sure what the company's plans are for the previously-discussed upgrade module, which should let you take an existing Asus VG248QE and make it G-Sync ready "in about 20 minutes."
What we can say, though, is that the wait will be worth it. You'll see its influence unmistakably in some games and notice it less in others. But the technology does effectively settle the age-old question of whether to keep V-sync enabled or not.
Here's another interesting thought. Now that G-Sync is being tested, how long will AMD hold off on commenting? The company teased our readers in Tom's Hardware's AMA With AMD, In Its Entirety, mentioning it'd address the capability soon. Does it have something up its sleeve or not? Between the Mantle-enhanced version of Battlefield 4, Nvidia's upcoming Maxwell architecture, G-Sync, CrossFire powered by AMD's xDMA engine, and rumored upcoming dual-GPU boards, the end of 2013 and beginning of 2014 are bound to give us plenty of interesting news to talk about. Now, if we could just get more than 3 GB (Nvidia) and 4 GB (AMD) of GDDR5 on high-end cards that don't cost $1000...





