Nvidia G-SYNC Fixes Screen Tearing in Games on Kepler GPUs
Nvidia has introduced a monitor module that works with Kepler-based GeForce GPUs to fix V-SYNC problems in games.
Nvidia's Tom Peterson has updated the company's blog with news of a new technology aimed at fixing the problems related to V-SYNC. With V-SYNC off, many games run at fast frame rates but suffer annoying visual tearing; with it on, many games stutter and lag. The problem dates back to the early '90s, and the GPU company set out to find a way around the stuttering tied to fixed monitor refresh rates.
"We brought together about two dozen GPU architects and other senior guys to take apart the problem and look at why some games are smooth and others aren't," he writes. "It turns out that our entire industry has been syncing the GPU's frame-rendering rate to the monitor's refresh rate – usually 60 Hz – and it's this syncing that's causing a lot of the problems."
He said that monitors have a fixed 60 Hz refresh rate for historical reasons: PC monitors originally used TV components, and here in the States, 60 Hz has been the standard TV refresh rate since the 1940s. The U.S. power grid delivers 60 Hz AC, and matching the TV refresh rate to it made TVs easier to build. The PC industry simply inherited the refresh rate.
That's where the G-SYNC module comes in. The device is built into a display and works with the hardware and software in certain Kepler-based GeForce GTX GPUs. Thanks to the module, the monitor begins a refresh cycle as soon as each frame finishes rendering on the GPU. And because the GPU takes a variable amount of time to render each frame, the monitor's refresh no longer happens at a fixed rate.
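To see why a fixed refresh rate causes trouble, here is a toy timing model (not Nvidia's implementation; the function name and the render times are made up for illustration). On a fixed 60 Hz monitor a finished frame must wait for the next refresh boundary before it can be shown; with a refresh-on-completion scheme like G-SYNC that wait is essentially zero.

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval on a fixed 60 Hz monitor


def wait_until_refresh(finish_ms, refresh_ms=REFRESH_MS):
    """How long a frame finished at `finish_ms` waits for the next
    fixed refresh boundary before it can be displayed. (A G-SYNC-style
    monitor starts a refresh as soon as the frame is done, so ~0 ms.)"""
    next_refresh = math.ceil(finish_ms / refresh_ms) * refresh_ms
    return next_refresh - finish_ms


# Hypothetical variable render times for four consecutive frames.
t = 0.0
for render_ms in (14.0, 21.0, 16.5, 30.0):
    t += render_ms  # moment the GPU finishes this frame
    print(f"frame done at {t:5.1f} ms: fixed-refresh wait "
          f"{wait_until_refresh(t):4.1f} ms, G-SYNC wait ~0 ms")
```

The model is simplified in that it assumes the frame waits (V-SYNC on) rather than tearing mid-scan (V-SYNC off), but it shows where the variable latency, and hence the stutter, comes from.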
"This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU," Peterson writes. "So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC."
According to Nvidia's separate press release, many monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic. Compatible Nvidia GPUs include GTX 650 Ti Boost, GTX 660, GTX 660 Ti, GTX 670, GTX 680, GTX 690, GTX 760, GTX 770, GTX 780 and GTX TITAN. The driver requirement is R331.58 or higher.
"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter," said John Carmack, co-founder, iD Software.
Will 20 fps on a 20 Hz G-Sync'ed monitor look as smooth as 60 fps on a regular monitor?
You can see it in action here:
http://www.guru3d.com/news_story/nvidia_announced_g_sync_eliminates_stutter_and_screen_tearing_with_a_daughter_module.html
Although, from what I've heard, the difference isn't as apparent in the video as it is in person. It's still pretty impressive, though.
Will 20 fps on a 20 Hz G-Sync'ed monitor look as smooth as 60 fps on a regular monitor?
Of course, at a certain point a game won't look smooth no matter what. At 5 fps, rendered graphics are going to look like a slideshow regardless (though those will be a clean, tear-free 5 frames per second). I think G-Sync gives you much more flexibility at low frame rates, but again only to a point; I'm willing to bet you could go as low as ~24 fps and still have a perfectly 'smooth' experience.
Will 20 fps on a 20 Hz G-Sync'ed monitor look as smooth as 60 fps on a regular monitor?
No, 20 fps is still 20 fps; it can never look like 60 fps. A game's fps is always variable, so with normal V-Sync on a 60 Hz monitor, at 20 fps the monitor refreshes three times for each frame displayed, which is fine. But at 23 fps the monitor can't display this with V-Sync on, since 23 doesn't divide evenly into 60, so the frame rate drops to 20 fps to keep frames synced with the monitor. You get a slight delay in the game engine before each frame is released to the monitor to keep things in sync (this is how you get stutter and input lag from your mouse/keyboard). Compare this with G-Sync: with the refresh rate varying to match the frame rate, there's no delay, no fps drops, and no stuttering, and the game engine/rendering doesn't have to slow down to stay synced, so a similar fps will feel way smoother with G-Sync. Personally, I can't wait to get a monitor that supports this, so long as they aren't ridiculously priced.
I would say that below 50 Hz/fps with LightBoost things would become very flickery, so it probably isn't a good idea unless the fps is high.
Sounds like what G-Sync/Nvidia is trying to accomplish should have been done generically, no? Or at least through their own proprietary API?
With that being said, don't bash me for being an AMD fanboy, because I run 2x GTX 680s. They should work on things that don't increase prices for consumers, like an Nvidia alternative to AMD's Mantle. I'd love to reap the benefits of that with my current hardware.
There's nothing stopping AMD from doing the same thing, except the development and implementation cost of course.
So why don't you just develop the hardware and or software required and then give it away for free? That way everyone will be happy and you'll get what you want.
They don't. GSync is what VSync should have been.
VSync with a 60 Hz monitor
GPU FPS:    61 / 60 / 59
Actual FPS: 60 / 60 / 30

GSync with a 60 Hz monitor
GPU FPS:    61 / 60 / 59
Actual FPS: 60 / 60 / 59