Nvidia G-SYNC Fixes Screen Tearing on Kepler-Based GPUs


dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
I love how Nvidia has put so much effort over the past generation into not just the raw performance of a real-time experience, but also the quality of that real-time experience. While this is a solution that could ultimately eliminate micro-stutter from real-time rendering entirely, I don't think it's a practical solution to the problem yet.
 

Sid Jeong

Honorable
Oct 13, 2013
19
0
10,510
Does this mean we won't have to care about fps anymore?
Will 20 fps on a 20 Hz G-Sync'ed monitor look as smooth as 60 fps on a regular monitor?
 

Nilo BP

Honorable
Aug 24, 2013
31
0
10,530
Call me crazy, but the only game in which I remember getting uncomfortable with screen tearing was the first Witcher. Everything else? Just turn off VSync and go on my merry way. From humble CS:GO to Tomb Raider with everything jacked up.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290


Of course, at a certain point a game won't look smooth no matter what. At 5 fps, rendered graphics are going to look like a slideshow regardless (although those will be some clean, tear-free frames), but G-Sync does give you much more flexibility at low frame rates, again only to a certain point. I'm willing to bet you could go as low as ~24 fps and still have a perfectly 'smooth' experience.
 

drezzz

Honorable
Oct 6, 2013
13
0
10,510
Another question: will this interfere with the LightBoost hack, or does it make the LightBoost hack unnecessary?
 


No, 20 fps is still 20 fps; it can never look like 60 fps. A game's fps is always variable. With normal vsync on a 60 Hz monitor, for example, at 20 fps the monitor refreshes three times for each frame displayed, which is fine. But at 23 fps the monitor can't display that with vsync on, because 23 doesn't divide evenly into 60, so the frame rate drops to 20 to stay synced with the monitor. The game engine holds each frame back slightly before releasing it to the monitor to keep things in sync, and that delay is how you get stutter and input lag from your mouse/keyboard.

Compare this with G-Sync: the refresh rate varies to match the frame rate, so there's no delay, no fps drops, and no stuttering; the game engine/renderer doesn't have to slow down to stay synced, so a similar fps will feel way smoother. Personally, I can't wait to get a monitor that supports this, so long as they aren't ridiculously priced.
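To make the arithmetic above concrete, here's a toy frame-pacing model in Python (my own sketch, not from the thread; the ~23 fps figure comes from the post, everything else is assumed, and it ignores buffering back-pressure):

import math

def present_times(render_done, refresh_hz=60):
    # Fixed refresh + vsync: each finished frame waits for the next
    # refresh tick, and the panel needs one tick to scan it out.
    tick = 1.0 / refresh_hz
    shown, next_free = [], 0.0
    for t in render_done:
        slot = max(next_free, math.ceil(t / tick) * tick)
        shown.append(slot)
        next_free = slot + tick
    return shown

# GPU finishing frames at a steady ~23 fps (43.5 ms apart), as in the post.
done = [i * 0.0435 for i in range(1, 7)]
fixed = present_times(done)
print([round(b - a, 4) for a, b in zip(fixed, fixed[1:])])
# -> [0.05, 0.0333, 0.05, 0.05, 0.0333]: frames held for 3, 2, 3, 3, 2 ticks
print([round(b - a, 4) for a, b in zip(done, done[1:])])
# -> steady 0.0435 s gaps: a variable-refresh panel just fires when each frame is ready

Same average frame rate in both prints, but the fixed-refresh gaps wobble between 33 and 50 ms, which is exactly the stutter the post describes.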
 

I would say that below 50 Hz/fps with LightBoost things would get very flickery, so it probably isn't a good idea unless your fps is high.
 

boju

Titan
Ambassador
What's AMD going to do, embed their own chips in monitors to compete? Do they have an alternative I'm not aware of?

Sounds like what G-Sync/Nvidia is trying to accomplish should have been done generically, no?
 

RupertJr

Honorable
Mar 20, 2013
27
0
10,530
Indeed! boju, this should be a standard and not tied to a hardware vendor. Having implemented G-Sync this way is lame, unless they make it an open standard. The PC industry evolved a lot thanks to standardization. Could you imagine what it would have been like if each vendor had implemented their own hardware standards?
 

joaompp

Distinguished
Apr 26, 2011
209
0
18,690
While this feature is great, not many people are going to shell out extra dough for that extra chip inside the monitor; it's just going to drive up monitor prices for a specific brand of video cards. It's a way for them to lock people into their ecosystem. This is going to be like their overpriced 3D monitors all over again.

With that being said, don't bash me for being an AMD fanboy, because I run 2x GTX 680s. They should work on things that don't increase prices for consumers, like an Nvidia alternative to AMD's Mantle. I'd love to reap the benefits of that with my current hardware.
 




There's nothing stopping AMD from doing the same thing, except the development and implementation cost of course.
 

Tanquen

Distinguished
Oct 20, 2008
256
8
18,785
If they know the monitor is running at XX Hz, then why don't they just make the card send a frame in sync with the monitor? You have a buffer that is read to send out a full frame, even if it has not been updated or if it has been updated 1-2-3-4 or more times since the last read. Just read and send the full frame each time, in sync with the monitor. It just seems like they are more interested in having software and hardware to sell and license.
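For what it's worth, the scheme described above (always scan out the most recent complete frame at each fixed refresh tick) is essentially classic triple buffering. A quick Python sketch (hypothetical numbers, my own illustration) shows why it kills tearing but not judder:

TICK = 1 / 60                              # hypothetical fixed 60 Hz panel
done = [i * 0.0435 for i in range(1, 10)]  # GPU finishing frames at ~23 fps

shown = []
for k in range(12):                        # twelve refresh ticks = 0.2 s
    t = k * TICK
    # At each tick, send out whichever full frame finished most recently.
    latest = max((i for i, d in enumerate(done) if d <= t), default=None)
    shown.append(latest)

print(shown)  # [None, None, None, 0, 0, 0, 1, 1, 2, 2, 2, 3]
# Each frame stays on screen for 2 or 3 ticks: no tearing, but the uneven
# cadence is exactly the judder a variable refresh rate removes.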
 


So why don't you just develop the hardware and/or software required and then give it away for free? That way everyone will be happy and you'll get what you want.
 

knightmike

Distinguished
Jan 10, 2009
252
4
18,815


They don't. GSync is what VSync should have been.

VSync with a 60 Hz monitor
GPU FPS: 61 / 60 / 59
Actual FPS: 60 / 60 / 30

GSync with a 60 Hz monitor
GPU FPS: 61 / 60 / 59
Actual FPS: 60 / 60 / 59
 
This sounds like something that could be done via software. How hard is it to sync frames up to a fixed refresh cycle? It just sounds like the video card drivers need to be updated so that the video card only outputs a frame when needed. This has been an issue for a long time and shouldn't require additional hardware.
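For what it's worth, the cadence in that table is just integer arithmetic; a quick Python sketch (my own illustration, assuming a 60 Hz panel and double buffering) reproduces the numbers:

import math

def vsync_fps(gpu_fps, refresh=60):
    # Double-buffered vsync: a frame that misses a refresh waits for the
    # next one, so the sustained rate is refresh / ceil(refresh / gpu_fps).
    return refresh / math.ceil(refresh / gpu_fps)

def gsync_fps(gpu_fps, panel_max=60):
    # Variable refresh: the panel waits for the frame instead,
    # up to its maximum refresh rate.
    return min(gpu_fps, panel_max)

for fps in (61, 60, 59):
    print(f"GPU {fps} -> vsync {vsync_fps(fps):.0f}, g-sync {gsync_fps(fps)}")
# GPU 61 -> vsync 60, g-sync 60
# GPU 60 -> vsync 60, g-sync 60
# GPU 59 -> vsync 30, g-sync 59

The formula is the easy part, though; a conventional panel's timing controller redraws on a fixed clock no matter what the driver sends, which is why the variable-refresh logic has to live in the monitor.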
 

Sid Jeong

Honorable
Oct 13, 2013
19
0
10,510


I just watched the demonstration video and it's very cool. Gotta wonder how things will work out, since you need not only specific graphics cards but also specific monitors from particular brands.
 


You almost got it, except with G-Sync the refresh rate is not fixed at 60 Hz; it matches whatever the fps is.
 
When I was using a 60 Hz monitor, I sent feedback about the constant screen tearing while watching Netflix. The streaming content is 24 fps, so you get tearing in scenes with fast movement. Not too long ago you'd see Nvidia's driver updates include improvements for Adobe Flash video, but never any mention of Silverlight.
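For reference, the mismatch is easy to see numerically: 60 / 24 = 2.5, so with vsync each film frame has to stay up for either 2 or 3 refreshes (the classic 3:2 cadence), and without vsync it shows up as tearing instead. A quick sketch (my own illustration):

import math

REFRESH, FPS = 60, 24   # fixed 60 Hz panel, 24 fps film content
ticks_per_frame = [
    math.floor((i + 1) * REFRESH / FPS) - math.floor(i * REFRESH / FPS)
    for i in range(8)
]
print(ticks_per_frame)  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence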

I wonder if this proprietary mess will help with that. Though, those are all beefy graphics cards just for watching Netflix, lol.

I'm highly skeptical about it *requiring* Kepler. That part kind of stinks.
 
Status
Not open for further replies.