Nvidia G-SYNC Fixes Screen Tearing in Games on Kepler-based GPUs

By - Source: Nvidia | 67 comments

Nvidia has introduced a display module that works with its Kepler-based GPUs to fix the problems V-Sync causes in games.

Nvidia's Tom Peterson has updated the company's blog with news of a new technology aimed at fixing the problems related to V-SYNC. With V-SYNC off, many games run at high frame rates but suffer from annoying visual tearing; with it on, many games stutter and lag. This has been a problem since the early '90s, and the GPU company has now set out to find a way around the stuttering tied to fixed monitor refresh rates.

"We brought together about two dozen GPU architects and other senior guys to take apart the problem and look at why some games are smooth and others aren't," he writes. "It turns out that our entire industry has been syncing the GPU's frame-rendering rate to the monitor's refresh rate – usually 60 Hz – and it's this syncing that's causing a lot of the problems."

He said that, for historical reasons, monitors have fixed refresh rates, typically 60 Hz. PC monitors originally used TV components, and here in the States 60 Hz has been the standard TV refresh rate since the 1940s. Back then the U.S. power grid ran on 60 Hz AC, so matching the TV refresh rate to the grid frequency made TVs easier to build. The PC industry simply inherited the refresh rate.

That's where the G-SYNC module comes in. The device will be built into a display and work with the hardware and software in certain Kepler-based GeForce GTX GPUs. Thanks to the module, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. And because the GPU takes a variable amount of time to render each frame, the monitor's refresh no longer happens at a fixed rate.
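
To make that timing relationship concrete, here is a rough sketch in Python of the difference between a fixed 60 Hz refresh and a refresh driven by frame completion. It is an illustration only, not Nvidia's implementation; the frame completion times and helper names (display_times_vsync, display_times_gsync) are invented for the example, and real G-SYNC panels also enforce a maximum refresh rate that the sketch ignores.

```python
# Illustrative sketch only: when frames appear on screen with a fixed 60 Hz
# V-SYNC'd display versus a variable-refresh display that starts a refresh
# as soon as the GPU finishes a frame. Frame times are invented.
import math

REFRESH_INTERVAL_MS = 1000.0 / 60.0  # fixed 60 Hz refresh period (~16.7 ms)

def display_times_vsync(render_done_ms):
    """Each finished frame waits for the next fixed refresh tick."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            for t in render_done_ms]

def display_times_gsync(render_done_ms):
    """The monitor begins a refresh right after each frame is rendered."""
    return list(render_done_ms)

# Hypothetical GPU frame completion times (ms) as scene complexity varies.
frames = [16.0, 35.0, 58.0, 71.0, 95.0]

print("Fixed 60 Hz + V-SYNC:", [round(t, 1) for t in display_times_vsync(frames)])
print("Variable refresh    :", [round(t, 1) for t in display_times_gsync(frames)])
# The fixed-refresh timestamps snap to 16.7 ms ticks, so the gaps between
# displayed frames jitter; the variable-refresh timestamps track the GPU.
```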

"This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU," Peterson writes. "So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC."

According to Nvidia's separate press release, many monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic. Compatible Nvidia GPUs include GTX 650 Ti Boost, GTX 660, GTX 660 Ti, GTX 670, GTX 680, GTX 690, GTX 760, GTX 770, GTX 780 and GTX TITAN. The driver requirement is R331.58 or higher.

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter," said John Carmack, co-founder, iD Software.

This thread is closed for comments
  • monsta, October 18, 2013 7:22 PM (score 0)
    Gonna hold off the monitor upgrade and wait to see this in action.
  • dragonsqrrl, October 18, 2013 7:27 PM (score 6)
    I love how Nvidia has put so much effort over the past generation into not just the raw performance of a real-time experience, but also the quality of that real-time experience. While this is a solution that ultimately eliminates micro-stutter from real-time rendering entirely, I don't think it's a practical solution to the problem, yet.
  • Sid Jeong, October 18, 2013 7:28 PM (score -7)
    Does this mean we won't have to care about fps anymore?
    Will 20 fps on 20hz g-sync'ed monitor look as smooth as 60 fps on regular monitor?
  • dragonsqrrl, October 18, 2013 7:33 PM (score 1)
    Quote:
    Gonna hold off the monitor upgrade and wait to see this in action.


    You can see it in action here:

    http://www.guru3d.com/news_story/nvidia_announced_g_sync_eliminates_stutter_and_screen_tearing_with_a_daughter_module.html


    Although from what I've heard the difference isn't as apparent in the video as it is in person. It's still pretty impressive though.
  • Nilo BP, October 18, 2013 7:34 PM (score -5)
    Call me crazy, but the only game in which I remember getting uncomfortable with screen tearing was the first Witcher. Everything else? Just turn off VSync and go on my merry way. From humble CS:GO to Tomb Raider with everything jacked up.
  • dragonsqrrl, October 18, 2013 7:45 PM (score 6)
    Quote:
    Does this mean we won't have to care about fps anymore?
    Will 20 fps on 20hz g-sync'ed monitor look as smooth as 60 fps on regular monitor?


    Of course at a certain point a game won't look smooth no matter what. At 5 fps rendered graphics are going to look like a slideshow regardless (although those will be some clean, tear-free 5 frames per second). But I think G-Sync does give you much more flexibility with low frame rates, again only to a certain point. I'm willing to bet you could probably go as low as ~24 fps and still have a perfectly 'smooth' experience.
  • drezzz, October 18, 2013 8:03 PM (score 1)
    Another question is will this interfere with the lightboost hack or does this make lightboost hack unnecessary?
  • iam2thecrowe, October 18, 2013 8:07 PM (score 4)
    Quote:
    Does this mean we won't have to care about fps anymore?
    Will 20 fps on 20hz g-sync'ed monitor look as smooth as 60 fps on regular monitor?


    No, 20 fps is still 20 fps; it can never look like 60 fps. A game's fps is always variable, so with normal V-Sync on a 60 Hz monitor, for example, at 20 fps the monitor refreshes three times for each frame displayed, which is fine. But at 23 fps the monitor can't display this with V-Sync on, since 23 doesn't divide evenly into 60, so it drops down to 20 fps to keep frames synced with the monitor. You get a slight delay in the game engine before the frame is released to the monitor to keep things in sync (this is how you get stutter and input lag from your mouse/keyboard). Compare this with G-Sync: with a variable refresh rate synced to the frame rate there is no delay, no fps drops, no stuttering, and the game engine/rendering doesn't have to slow down to keep things synced, so similar fps will feel way smoother with G-Sync. Personally, I can't wait to get a monitor that supports this, so long as they aren't ridiculously priced.
  • iam2thecrowe, October 18, 2013 8:10 PM (score 1)
    Quote:
    Another question is will this interfere with the lightboost hack or does this make lightboost hack unnecessary? Very happy that I bought the Asus over the Benq in any case.

    I would say below 50hz/fps with lightboost, things would become very flickery, so it probably isn't a good idea unless fps is high.
  • boju, October 18, 2013 8:24 PM (score 3)
    What's AMD going to do, implant their own chips as well to compete? Do they have an alternative I'm not aware of?

    Sounds like what Gsync/Nvidia is trying to accomplish should have been done generically, no?
  • eodeo, October 18, 2013 8:41 PM (score -1)
    How do they overcome the actual 60fps hardware limitation of monitors? Did I miss it?
  • RupertJr, October 18, 2013 8:45 PM (score 1)
    Indeed! boju, this should be a standard and not tied to a hardware vendor. Having implemented G-Sync in this way is lame, unless they make it an open standard. The PC industry evolved a lot thanks to standardization. Could you imagine what would have been if each vendor implemented their own hardware standards?
  • dragonsqrrl, October 18, 2013 9:17 PM (score 3)
    Quote:
    Could you imagine what would have been if each vendor implemented their own hardware standards?


    Or their own proprietary API?
  • tomfreak, October 18, 2013 10:11 PM (score 0)
    the technology is on the monitor, but it is locked on 1 GPU vendor.....right..lol....
  • joaompp, October 18, 2013 10:12 PM (score 1)
    While this feature is great, not many people are going to shell out extra dough for that extra chip inside the monitor. It's just going to drive up monitor prices for a specific brand of video cards; it's a way for them to lock people into their ecosystem. This is going to be like their overpriced 3D monitors all over again.

    With that being said, don't bash me for being an AMD fanboy, because I run 2x GTX 680s. They should work on things that don't increase prices to consumers, like an Nvidia alternative to AMD's Mantle. I'd love to reap the benefits of that with my current hardware.
  • invlem, October 18, 2013 10:21 PM (score 1)
    More proprietary crap that only works on specific hardware configurations... no thanks.
  • Mousemonkey, October 18, 2013 10:22 PM (score 0)
    Quote:
    Indeed! boju, this should be a standard and not tied to a hardware vendor. Having implemented G-Sync in this way is lame, unless they make it an open standard. The PC industry evolved a lot thanks to standardization. Could you imagine what would have been if each vendor implemented their own hardware standards?


    Quote:
    the technology is on the monitor, but it is locked on 1 GPU vendor.....right..lol....


    There's nothing stopping AMD from doing the same thing, except the development and implementation cost of course.
  • Tanquen, October 18, 2013 10:42 PM (score -3)
    If they know the monitor is running at XX Hz then why don’t they just make the card send a frame in sync with the monitor? You have a buffer that is read to send out a full frame even if it has not been updated, or if it has been updated 1-2-3-4 or more times since the last read. Just read and send the full frame each time in sync with the monitor. It just seems like they are more interested in software and hardware to sell and license.
  • Mousemonkey, October 18, 2013 10:48 PM (score 0)
    Quote:
    If they know the monitor is running at XX Hz then why don’t they just make the card send a frame in sync with the monitor? You have a buffer that is read to send out a full frame even if it has not been updated, or if it has been updated 1-2-3-4 or more times since the last read. Just read and send the full frame each time in sync with the monitor. It just seems like they are more interested in software and hardware to sell and license.


    So why don't you just develop the hardware and/or software required and then give it away for free? That way everyone will be happy and you'll get what you want.
  • knightmike, October 18, 2013 11:01 PM (score 1)
    Quote:
    How do they overcome the actual 60fps hardware limitation of monitors? Did I miss it?


    They don't. GSync is what VSync should have been.

    VSync with a 60 Hz monitor
    GPU FPS: 61 / 60 / 59
    Actual FPS: 60 / 60 / 30

    GSync with a 60 Hz monitor
    GPU FPS: 61 / 60 / 59
    Actual FPS: 60 / 60 / 59