System requirements
G-Sync isn't part of any existing standard, nor does Nvidia anticipate trying to get it included with future versions of DisplayPort. As such, there are some specific requirements that need to be satisfied before you can expect that G-Sync-capable monitor you have your eye on to work properly.
First, you need an Nvidia graphics card. Specifically, it needs to be a GeForce GTX 650 Ti Boost or faster model. Kepler is the first graphics architecture with an integrated display controller that can be programmed to enable G-Sync, so even if you have a Fermi-based GPU that's faster, the technology won't work. Maxwell was designed specifically to support it, so upcoming cards will feature G-Sync as well.
The second requirement is a monitor with Nvidia's G-Sync module built in. This module replaces the screen's scaler, so it's not possible to add G-Sync to a tiled Ultra HD display, for example. In today's story, we're using a prototype capable of 1920x1080 at up to 144 Hz. But you can imagine just how much more impact G-Sync will have if manufacturers start adding it to less expensive 60 Hz panels.
Third, you need to be using a DisplayPort 1.2 cable; DVI and HDMI connections are not supported. In the near term, this means that the only way G-Sync is going to work across a multi-display Surround array is via a three-way SLI configuration, since each card has at most a single DisplayPort connection, and adapting from a card's DVI output to DisplayPort won't work. Similarly, an MST hub won't do the trick.
Finally, driver support is required. The latest 331.93 beta software enables G-Sync, and we assume future WHQL-certified releases will include it as well.
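If you want a quick sanity check of the GPU and driver requirements before shopping for a monitor, something along the lines of the sketch below works. This is a hypothetical helper of ours, not an Nvidia tool; it assumes nvidia-smi is installed and on the PATH, and the "Kepler or newer" test is a crude name-based guess rather than an authoritative architecture query.

```python
# Hypothetical sanity check for the requirements above -- our own sketch,
# not an Nvidia tool. Assumes nvidia-smi is installed and on the PATH.
# The DisplayPort 1.2 cable and the G-Sync monitor itself still have to
# be verified by hand.
import subprocess

MIN_DRIVER = (331, 93)  # first beta driver with G-Sync support
KEPLER_OR_NEWER_HINTS = ("GTX 65", "GTX 66", "GTX 67", "GTX 68", "GTX 69",
                         "GTX 7", "GTX TITAN")  # rough name-based heuristic only

def check_gsync_requirements():
    # First GPU only; multi-GPU systems print one line per card.
    line = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        text=True).splitlines()[0]
    name, driver = [field.strip() for field in line.split(",")]

    driver_ok = tuple(int(p) for p in driver.split(".")[:2]) >= MIN_DRIVER
    gpu_ok = any(hint in name.upper() for hint in KEPLER_OR_NEWER_HINTS)

    print(f"GPU:    {name}  (GTX 650 Ti Boost or faster? {gpu_ok})")
    print(f"Driver: {driver}  (331.93 or newer? {driver_ok})")

if __name__ == "__main__":
    check_gsync_requirements()
```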
Test Setup
| Test Hardware | |
|---|---|
| Processors | Intel Core i7-3970X (Sandy Bridge-E) 3.5 GHz Base Clock Rate, Overclocked to 4.3 GHz, LGA 2011, 15 MB Shared L3, Hyper-Threading enabled, Power-savings enabled |
| Motherboard | MSI X79A-GD45 Plus (LGA 2011) X79 Express Chipset, BIOS 17.5 |
| Memory | G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V |
| Hard Drive | Samsung 840 Pro SSD 256 GB SATA 6Gb/s |
| Graphics | Nvidia GeForce GTX 780 Ti 3 GB |
| | Nvidia GeForce GTX 760 2 GB |
| Power Supply | Corsair AX860i 860 W |
| System Software And Drivers | |
| Operating System | Windows 8 Professional 64-bit |
| DirectX | DirectX 11 |
| Graphics Driver | Nvidia GeForce 331.93 Beta |
Now, it's important to understand where G-Sync does and does not yield the most significant impact. There's a good chance you're currently using a screen that operates at 60 Hz. Faster 120 and 144 Hz refresh rates are popular amongst gamers, but Nvidia is (rightly) predicting that its biggest market will be the enthusiasts still stuck at 60 Hz.
With V-sync turned on at 60 Hz, the most visually-disturbing artifacts are encountered when 60 FPS cannot be maintained, yielding those jarring jumps between 30 and 60 FPS. That's where you see significant stuttering. With V-sync turned off, scenes with a lot of motion or panning side to side make tearing most apparent. For some enthusiasts, this detracts so much from the game that they simply turn V-sync on and live with the stuttering and incurred input lag.
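To put numbers on that 30-to-60 FPS jump, here's a back-of-the-envelope sketch (our own illustration, not anything Nvidia publishes): double-buffered V-sync holds each frame until the next full 16.7 ms scan-out, while a variable refresh simply waits for the frame to finish.

```python
# Illustrative frame pacing at 60 Hz: double-buffered V-sync quantizes each
# frame to a whole number of scan-outs, while a G-Sync-style variable refresh
# displays the frame as soon as it is ready. Render times below are made up
# for illustration, not measurements from our testing.
import math

SCAN_MS = 1000.0 / 60.0  # ~16.7 ms per 60 Hz scan-out

def vsync_ms(render_ms):
    """V-sync on: the frame is held until the next full refresh boundary."""
    return math.ceil(render_ms / SCAN_MS) * SCAN_MS

def gsync_ms(render_ms):
    """Variable refresh: the panel waits for the frame (capped at 60 Hz here)."""
    return max(render_ms, SCAN_MS)

for render_ms in (14.0, 18.0, 25.0, 33.0):
    print(f"render {render_ms:4.1f} ms -> "
          f"V-sync {1000 / vsync_ms(render_ms):4.1f} FPS, "
          f"variable refresh {1000 / gsync_ms(render_ms):4.1f} FPS")
```

Anything even slightly over 16.7 ms snaps straight to 30 FPS with V-sync on; a variable refresh degrades gracefully instead, which is exactly the behavior G-Sync is after.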
As you step up to 120 and 144 Hz and higher frame rates, the display refreshes itself more often, cutting down on the amount of time one frame persists across multiple scans when performance can't keep up. However, the same issues with V-sync on and off persist. For this reason, we'll be hands-on testing the Asus monitor at 60 and 144 Hz, with and without G-Sync enabled.
Tearing and input lag at 60 Hz on a 2560x1440 or 2560x1600 panel have been the only reason I won't game on one. G-Sync will get me there.
This is awesome, outside-of-the-box thinking tech.
I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions! Why not get a licensing payment for all monitors sold with this tech? Or all video cards implementing this tech? It just makes sense.
What the hell is Mantle?
I applaud the advancement, but I have a perfectly functional 26-inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.
At that point, I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors make similar solutions, I feel like this is doomed to be as niche as LightBoost, PhysX, and, I suspect, Mantle.
You mention it being smooth when set to 144 Hz with G-Sync; is there any way you can cap the display at 64 Hz and try it with G-Sync alone (iPresentInterval=0) to see what happens then? Just wondering if the game is at fault here, and if that specific issue is still there in the latest version of the engine.
Alternatively, I suppose you could load up Fallout 3 or NV instead and see if the G-Sync results match Skyrim.
Mantle (if it turns out to be what they say): better CPU performance, better GPU performance, maybe open source at some point, and no need for a new monitor.
G-Sync: good for old hardware that can't reach 60 FPS, bad because you need a new monitor. So the guys who can't afford a better GPU will have to buy a new monitor instead?!
Get it standardised and into the DVI/HDMI/DP specs, then it'll take off.
I wonder if you could just add a flag for variable vertical blanks, and have it send a 'starting next frame' sequence whenever a frame is rendered.
If it's not included by default in monitors, it'll become the next PhysX. And to do that it has to be platform-agnostic.
oh really? I envy your eyes.
Considering Mantle, what does GPU performance matter on a screen with input lag, or on a screen with tearing and choppy, blurry video?
Mantle will not solve this problem. Mantle is supposed to be more of a low-level common API, with enhanced GPU performance as a possible advantage. I'm not sure that even compares to what's being discussed here. Maybe I'm way off?
G-Sync will eliminate input lag, tearing, and blur, and as a result add to the overall realism of the gaming experience.
The monitor might be expensive right now, but it will be a good investment if you decide to go that route. At the very least, you don't upgrade your monitor as often as your GPU; my current monitor has been paired with a GTS 250, a GTX 460, and now GTX 660 SLI. The only downside is that it locks you into using Nvidia GPUs.