G-Sync Technology Preview: Quite Literally A Game Changer

Anonymous
December 12, 2013 6:00:03 AM

You've forever faced this dilemma: disable V-sync and live with image tearing, or turn V-sync on and tolerate the annoying stutter and lag? Nvidia promises to make that question obsolete with a variable refresh rate technology we're previewing today.

G-Sync Technology Preview: Quite Literally A Game Changer : Read more
December 12, 2013 6:34:10 AM

I consider Gsync to be the most important gaming innovation since DX7. It's going to be one of those "How the HELL did we live without this before?" technologies.
December 12, 2013 6:59:43 AM

Totally agree, G Sync is really impressive and the technology we have been waiting for.
What the hell is Mantle?
December 12, 2013 7:01:34 AM

I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me, 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120Hz monitor with SLI, I could hardly see any difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
December 12, 2013 7:11:11 AM

Competition, competition. Anybody who is flaming over who is better: AMD or nVidia, is clearly missing the point. With nVidia's G-Sync, and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices, better products.
December 12, 2013 7:13:04 AM

This needs to be not so proprietary for it to become a game changer. As it is, requiring a specific GPU and specific monitor with an additional price premium just isn't compelling and won't reach a wide demographic.

Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors make similar solutions, I feel like this is doomed to be as niche as lightboost, Physx, and, I suspect, Mantle.
December 12, 2013 7:21:36 AM

g sync tv pleeeeeeeeeease
December 12, 2013 7:22:37 AM

I'm on page 4, and I can't even contain myself.

Tearing and input lag at 60Hz on a 2560x1440 or 2560x1600 have been the only reason I won't game on one. G-Sync will get me there.

This is awesome, outside-of-the-box thinking tech.

I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions! Why not get a licensing payment for all monitors sold with this tech? Or all video cards implementing this tech? It just makes sense.
December 12, 2013 7:25:25 AM

Could the Skyrim stuttering at 60Hz with G-Sync be because the engine operates internally at 64Hz? All those Bethesda-tech games drop 4 frames every second when v-synced to 60Hz, which causes that severe microstutter you see on nearby floors and walls when moving and strafing. The same thing happened in Oblivion, Fallout 3, and New Vegas on PC. You had to use stutter-removal mods in conjunction with the script extenders to actually force the game to operate at 60Hz and smooth it out with V-sync on.

You mention it being smooth when set to 144Hz with G-Sync; is there any way you could cap the display at 64Hz and try it with G-Sync alone (iPresentinterval=0) to see what happens then? Just wondering if the game is at fault here and if that specific issue is still there in the latest version of their engine.

Alternatively I suppose you could load up Fallout 3 or NV instead and see if the Gsync results match Skyrim.
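To make the 64Hz-vs-60Hz mismatch concrete, here's a tiny sketch (the fixed 64Hz internal tick is the assumption being tested; nothing here is Bethesda's actual code):

```python
# Tiny sketch: a game ticking at a fixed 64 Hz, shown on a 60 Hz v-synced
# display. Roughly four game frames a second never make it to the screen,
# and the on-screen frame advance is uneven, which reads as microstutter.

GAME_HZ = 64.0
DISPLAY_HZ = 60.0

def simulate(seconds=1.0):
    shown = []  # index of the newest finished game frame at each refresh
    for r in range(int(DISPLAY_HZ * seconds)):
        refresh_time = r / DISPLAY_HZ
        shown.append(int(refresh_time * GAME_HZ))
    produced = int(GAME_HZ * seconds)
    skipped = produced - len(set(shown))
    return shown, skipped

shown, skipped = simulate()
print("game frames skipped per second:", skipped)    # 4 with 64 vs 60
advances = [b - a for a, b in zip(shown, shown[1:])]
print("frame advance per refresh:", advances[:20])   # mostly 1, with a 2 every ~15
```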
December 12, 2013 7:25:40 AM

I would be excited for this if it weren't for Oculus Rift. I don't mean to be dismissive, this looks awesome... but it isn't Oculus Rift.
December 12, 2013 7:28:22 AM

Am I the only one who has never experienced screen tearing? Most of my games run past my refresh rate too....
December 12, 2013 7:45:53 AM

I play a lot of MMOs and tearing appears all the time. This is nice tech. I can't wait for 2560x1440 with G-Sync.
December 12, 2013 7:46:32 AM

TLM: Physx is not that way anymore. The newest gen of consoles can take advantage of Physx now and I'm sure will. That's hardly niche.
December 12, 2013 7:54:45 AM

I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot: just like PhysX, some people care about it, but most don't even acknowledge it. Technically it could run on anything; practically it runs only on Nvidia. And being a monitor-side feature, I would see this as a monitor company's tech, not a graphics card maker's closed stuff. I'm more interested in Mantle than this, since it promises better multicore CPU performance and better fps on your existing hardware.

Mantle (if it is what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor.

G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!
December 12, 2013 7:55:02 AM

You guys should've tried Dead Island with G-Sync, the tearing in this game was really horrible.
December 12, 2013 7:55:57 AM

I can't wait for G-Sync on 2560x1440.
December 12, 2013 7:59:05 AM

Needs to be open. If it's not in a standard, it won't take off. How many people want to rip open a display (likely voiding the warranty) to install a pricey add-on card, and then be strongly limited in what other equipment they can use it with?

Get it standardised and into the DVI/HDMI/DP specs, then it'll take off.

I wonder if you could just add a flag for variable vertical blanks, and have it send a 'starting next frame' sequence whenever a frame is rendered.

If it's not included by default in monitors, it'll become the next PhysX. And to do that it has to be platform-agnostic.
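Roughly what I'm picturing, as a purely hypothetical sketch; none of these names or timings come from any real spec or driver:

```python
# Hypothetical sketch of a "variable vertical blank": the panel sits in
# blanking until the source signals that the next frame is ready, instead
# of scanning out on its own fixed clock. All names and limits are made up.
import random

MIN_FRAME_S = 1 / 144   # assume the panel can't refresh faster than 144 Hz
MAX_BLANK_S = 1 / 30    # assume it must self-refresh at least every ~33 ms

def render_times(n, mean_frame_s=1 / 52, seed=1):
    """Fake GPU finishing frames at irregular intervals (~52 fps average)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += rng.uniform(0.7, 1.4) * mean_frame_s
        times.append(t)
    return times

def scanout_times(frame_ready):
    """Start a scanout whenever a frame is ready, within the panel's limits."""
    scans, last = [], 0.0
    for ready in frame_ready:
        while ready > last + MAX_BLANK_S:       # GPU stalled: repeat old frame
            last += MAX_BLANK_S
        start = max(ready, last + MIN_FRAME_S)  # never exceed the max rate
        scans.append(start)
        last = start
    return scans

frames = render_times(8)
for ready, scan in zip(frames, scanout_times(frames)):
    print(f"frame ready at {ready*1000:6.1f} ms -> scanout at {scan*1000:6.1f} ms")
```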
December 12, 2013 8:01:06 AM

wurkfur said:
I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me, 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120Hz monitor with SLI, I could hardly see any difference.


oh really? I envy your eyes.
December 12, 2013 8:09:06 AM

ohim said:
I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot: just like PhysX, some people care about it, but most don't even acknowledge it. Technically it could run on anything; practically it runs only on Nvidia. And being a monitor-side feature, I would see this as a monitor company's tech, not a graphics card maker's closed stuff. I'm more interested in Mantle than this, since it promises better multicore CPU performance and better fps on your existing hardware.

Mantle (if it is what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor.

G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!


Considering Mantle, what does GPU performance matter on a screen with input lag, or a screen with tearing and choppy, blurry video?

Mantle will not solve this problem. Mantle is supposed to be more of a low-level common API with enhanced GPU performance as a possible advantage. I'm not sure that even compares to what's being discussed here. Maybe I'm way off???

G-sync will eliminate input lag, tearing and blur and as a result add to the overall realism of the gaming experience.
December 12, 2013 8:27:57 AM

This will become more important as we migrate to 4k displays. At that resolution, maintaining very high frame rates will become more difficult. Allowing a better experience at lower frame rates will become more important and more valuable.
December 12, 2013 8:47:49 AM

ohim said:
I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot: just like PhysX, some people care about it, but most don't even acknowledge it. Technically it could run on anything; practically it runs only on Nvidia. And being a monitor-side feature, I would see this as a monitor company's tech, not a graphics card maker's closed stuff. I'm more interested in Mantle than this, since it promises better multicore CPU performance and better fps on your existing hardware.

Mantle (if it is what they say) - better CPU performance, better GPU performance, at some point open source!?!?, no need for a new monitor.

G-Sync - good on old hardware that can't reach 60 fps, bad since you need a new monitor, so guys who can't afford a better GPU will have to get a new monitor?!?!?!


The monitor might be expensive right now, but it will be a good investment if you decide to go that route. At the very least, you don't upgrade your monitor as often as your GPU. My current monitor has been paired with a GTS 250, a GTX 460, and now GTX 660 SLI. The only downside is that it will lock you into Nvidia GPUs only.
December 12, 2013 9:02:35 AM

If you're already running a 120/144Hz monitor, this technology won't be that great or really noticeable. I honestly cannot spot any noticeable tearing or ghosting on my BenQ monitor, and I would question anyone who claims they could. With 60Hz monitors this will be great, but expect this technology to be extremely expensive (at the level of 120/144Hz monitors). I think this will be a game changer once the technology becomes cheaper and becomes a standard. I expect it will initially be just for those enthusiasts with money to blow. I'm basing this completely on this review, with these quotes especially being important.

"Now, it's important to understand where G-Sync does and does not yield the most significant impact. There's a good chance you're currently using a screen that operates at 60 Hz. Faster 120 and 144 Hz refresh rates are popular amongst gamers, but Nvidia is (rightly) predicting that its biggest market will be the enthusiasts still stuck at 60 Hz."

"Of course, as you shift over into real-world gaming, the impact is typically less binary. There are shades of "Whoa!" and "That's crazy" on one end of the spectrum and "I think I see the difference" on the other.... In certain cases, just the shift from 60 to 144 Hz is what will hit you as most effective, particularly if you can push those really high frame rates from a high-end graphics subsystem."
December 12, 2013 9:12:58 AM

this is truly a beautiful advancement.

now all we need is true implementation. Because G-sync literally requires a hardware component in the monitor, we need a wide range of monitors to implement this, from your everyday cheap LCD panel to the color accurate IPS panels.

that aside, I know it's too much to hope for, but can Nvidia not lock down these nice things to their own GPUs? c'mon, open it up to AMD and (god forbid) intel's iGPUs
December 12, 2013 9:15:11 AM

expl0itfinder said:
Competition, competition. Anybody who is flaming over who is better: AMD or nVidia, is clearly missing the point. With nVidia's G-Sync, and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices, better products.


However, this kind of competition does not lead to lower prices and better products. It can only lead to fanboyism, increased marketing costs (to promote their own 'special thing'), and an increasingly polarized market. I'm glad that AMD is willing to open up Mantle to other companies. Nvidia needs to do the same with this G-Sync technology.
December 12, 2013 9:23:04 AM

pricetag_geek said:
wurkfur said:
I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me, 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120Hz monitor with SLI, I could hardly see any difference.


oh really? I envy your eyes.


Ahh, to be blind.
December 12, 2013 9:37:48 AM

I still don't get why we need it. Many programs, movies, and games are silky smooth with no need for extra hardware in the monitor. The monitor doesn't care; it's the card that's sending a messed-up frame. You know the monitor wants a frame every 60th of a second or so, depending on the refresh rate. Buffers A and B are always building frames. Only one is ready every 60th or 120th of a second. Send the one that is ready. Time to send a new frame and a new one is not ready? Send the same one again! "No, it's better to have the monitor change its frame rate on the fly"? What? It's the same thing. Slow the monitor down so it's still showing the last frame. This should all be handled in the card, but then they could not license a monitor to you too.
December 12, 2013 9:45:00 AM

Quote:
wurkfur wrote:
I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me, 60 fps is perfectly acceptable, and even when I went to my friend's house where he had a 120Hz monitor with SLI, I could hardly see any difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.

A big part of 120Hz is not what it looks like, but the latency it reduces. You feel much more connected to the view in first-person games. Just watching won't show you the difference. You also need to have good FPS as well.
December 12, 2013 10:04:18 AM

Tanquen said:
I still don't get why we need it. Many programs, movies, and games are silky smooth with no need for extra hardware in the monitor. The monitor doesn't care; it's the card that's sending a messed-up frame. You know the monitor wants a frame every 60th of a second or so, depending on the refresh rate. Buffers A and B are always building frames. Only one is ready every 60th or 120th of a second. Send the one that is ready. Time to send a new frame and a new one is not ready? Send the same one again! "No, it's better to have the monitor change its frame rate on the fly"? What? It's the same thing. Slow the monitor down so it's still showing the last frame. This should all be handled in the card, but then they could not license a monitor to you too.


If you truly believe that, then you don't understand.

Currently the choice is either to have the GPU wait for the next refresh while displaying the same frame, which causes big discrepancies in your frame rate and makes you see judder, or to just draw the frame when it's ready even if the monitor is in the middle of drawing another frame, which causes the screen to tear. There is no option that gives silky smoothness; except now there is, with G-Sync.

The reason movies look fine at 24fps and 30fps is all the motion blur that's accurately recorded on the film used. So you aren't actually seeing silky smooth motion, but instead a blurry mess of the motion that happened during the 41ms or 33ms that the film captured, which your brain is pretty good at putting together as motion. Though, if you run a 24p movie on a 60Hz TV you also get that kind of unnatural-feeling motion, because some frames last longer than others, causing judder. Hence why 120Hz TVs attempt to fix it: since 120Hz is a multiple of both 24 and 30, the frames can always be displayed for 5 or 4 scans respectively (or they use motion compensation to interpolate frames between the two actual frames... essentially making new frames that are a blend of the previous and next recorded ones, which sometimes works but sometimes looks awful).

Also, G-Sync will reduce the amount of lag you feel because you don't need triple buffering enabled while V-sync is on in order to keep frames flowing as smoothly as possible. Currently, if you have V-sync and triple buffering on with a 60Hz screen, the frame you are seeing was generated 33ms ago, there is one in the buffer made 16ms ago, and one being worked on. That means if something happens in game, it won't appear on your screen for 33ms. G-Sync solves that problem by always flipping to the new frame when it's ready. How this translates to real-world performance: if a gamer can react to something on screen in 120ms and two of these people are side by side, the one with G-Sync will be able to react in that 120ms, while the non-G-Sync user waits 33ms first and then gets to react, making the time 153ms... that's over 20% faster. If you don't think that's a big deal, then you obviously don't play any games where reaction times have any bearing on the outcome.
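Putting rough numbers on that last paragraph (a back-of-the-envelope sketch; the 120ms reaction time and the two-refresh queue are the same illustrative assumptions as above, not measurements):

```python
# Back-of-the-envelope version of the numbers above. With v-sync plus triple
# buffering at 60 Hz, the on-screen frame is roughly two refresh intervals
# old; flipping as soon as a frame is ready removes most of that queue.

REFRESH_S = 1 / 60           # ~16.7 ms per refresh at 60 Hz
REACTION_S = 0.120           # assumed human reaction time

queue_vsync = 2 * REFRESH_S  # frame on screen + one frame waiting in the buffer
queue_gsync = 0.0            # simplification: present-when-ready, no queue

total_vsync = REACTION_S + queue_vsync
total_gsync = REACTION_S + queue_gsync

print(f"v-sync + triple buffering: {total_vsync*1000:.0f} ms")   # ~153 ms
print(f"present-when-ready:        {total_gsync*1000:.0f} ms")   # 120 ms
print(f"the v-synced player responds {(total_vsync/total_gsync - 1)*100:.0f}% later, "
      f"i.e. the other player reacts {(1 - total_gsync/total_vsync)*100:.0f}% faster")
```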
December 12, 2013 10:16:11 AM

I never turn off V-sync, because the tearing makes me nauseous. I'll take bad stutter over nausea any day. But I remember reading recently about another method for fixing this problem, which reduces the detail of fast-moving elements to maintain a minimum framerate. This makes a lot of sense: I don't need a highly detailed rendering of something which is moving too fast to see clearly anyway. And in a fast-moving scene, I will probably not even be looking at the static elements, so they can be slightly less detailed as well. But I can't find the article. Does anyone recognize it?
December 12, 2013 10:18:16 AM

The maximum stutter (and input lag) difference between G-Sync and triple-buffered V-sync is less than the duration of one frame (~16.666ms @ 60Hz or ~8.333ms @ 120Hz).

The advantage of G-Sync is a smoother look at low fps due to consistent frame times. You would hardly see any difference between G-Sync and V-sync with triple buffering at higher refresh rates (100+). The problem with triple buffering is implementation.

If you can't get past 60 fps, you're better off upgrading your system (or dialing down the settings) than replacing your monitor... unless you're in the market for a new monitor and the price difference between a monitor with and without G-Sync is small, then why not?

EDIT: If G-Sync will add $100+ to the price (vs without) then it's better to put it on higher resolution screens (1440p+) to make game play smoother at below 60 fps (but higher than 40) since it's harder to crank up fps at higher resolutions.
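To put toy numbers on the "consistent frame time at low fps" point above (purely illustrative arithmetic, not a measurement; 45 fps and 60Hz are assumed just for the example):

```python
# Toy arithmetic: how long each frame stays on screen at ~45 fps, on a fixed
# 60 Hz v-synced panel (with triple buffering) versus a variable-refresh
# panel. The 45 fps figure is just an example of "below 60 but above 40".
import math

REFRESH_S = 1 / 60
FRAME_S = 1 / 45          # GPU delivers a frame every ~22.2 ms

def vsync_frame_times(n):
    """Each frame flips on the next 60 Hz refresh boundary, so on-screen time
    is a whole number of refreshes (a 16.7 / 16.7 / 33.3 ms cadence here)."""
    finish, flips = 0.0, []
    for _ in range(n):
        finish += FRAME_S
        flips.append(math.ceil(finish / REFRESH_S - 1e-9) * REFRESH_S)
    return [b - a for a, b in zip(flips, flips[1:])]

def variable_refresh_frame_times(n):
    """Each frame is scanned out when ready, so every frame lasts ~22.2 ms."""
    return [FRAME_S] * (n - 1)

print("fixed 60 Hz (ms):     ", [round(t * 1000, 1) for t in vsync_frame_times(8)])
print("variable refresh (ms):", [round(t * 1000, 1) for t in variable_refresh_frame_times(8)])
```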
December 12, 2013 10:24:09 AM

If they won't open up the tech for everyone (not only license, but make it PART of the standards), then I'll keep my pocket closed as well, with my money inside.

There's no way I'll trade a little tearing on my 120Hz monitor for the inability to choose between AMD and Nvidia (being hardware-locked) for my next video card. That is a stupid-as-hell compromise I will not take.

Cheers!
December 12, 2013 10:25:38 AM

Yeah, this is truly a huge innovation. I never game with V-sync on; the input lag is really noticeable, especially when playing games like CS, SC, or Dota.
December 12, 2013 10:31:14 AM

My next monitor will be a 4k screen with Gsync

In 2014 pretty please? ASUS are you listening? Get this into your 39" 4k screen!
December 12, 2013 10:45:51 AM

This reminds me of Lucid's Virtual Vsync. Only, Lucid's solution was purely software based... and open enough that I think it was even licensed to next gen consoles.

Meanwhile, this requires an Nvidia GPU and a monitor that has an Nvidia chip in it. I don't really get the excitement over this, to be honest.

Meanwhile, if someone didn't want to use Lucid's solution, or it didn't work well with a specific game... simply forcing triple buffering with V-sync on works great in most situations. So long as your FPS is at a decent level, the latency introduced isn't even noticeable... and you're less likely to run into issues than if you tried to use G-Sync.

Waste of time technology IMHO. Especially if Nvidia keeps it closed off.
December 12, 2013 10:46:00 AM

Well damn, I take back ANY doubt. This is awesome from just the read. Almost makes me want to get onto the GTX 780 bandwagon.
December 12, 2013 10:46:51 AM

Traciatim said:
Tanquen said:
I still don't get why we need it. Many programs, movies, and games are silky smooth with no need for extra hardware in the monitor. The monitor doesn't care; it's the card that's sending a messed-up frame. You know the monitor wants a frame every 60th of a second or so, depending on the refresh rate. Buffers A and B are always building frames. Only one is ready every 60th or 120th of a second. Send the one that is ready. Time to send a new frame and a new one is not ready? Send the same one again! "No, it's better to have the monitor change its frame rate on the fly"? What? It's the same thing. Slow the monitor down so it's still showing the last frame. This should all be handled in the card, but then they could not license a monitor to you too.


If you truly believe that, then you don't understand.

Currently the choice is either to have the GPU wait for the next refresh while displaying the same frame, which causes big discrepancies in your frame rate and makes you see judder, or to just draw the frame when it's ready even if the monitor is in the middle of drawing another frame, which causes the screen to tear. There is no option that gives silky smoothness; except now there is, with G-Sync.

The reason movies look fine at 24fps and 30fps is all the motion blur that's accurately recorded on the film used. So you aren't actually seeing silky smooth motion, but instead a blurry mess of the motion that happened during the 41ms or 33ms that the film captured, which your brain is pretty good at putting together as motion. Though, if you run a 24p movie on a 60Hz TV you also get that kind of unnatural-feeling motion, because some frames last longer than others, causing judder. Hence why 120Hz TVs attempt to fix it: since 120Hz is a multiple of both 24 and 30, the frames can always be displayed for 5 or 4 scans respectively (or they use motion compensation to interpolate frames between the two actual frames... essentially making new frames that are a blend of the previous and next recorded ones, which sometimes works but sometimes looks awful).

Also, G-Sync will reduce the amount of lag you feel because you don't need triple buffering enabled while V-sync is on in order to keep frames flowing as smoothly as possible. Currently, if you have V-sync and triple buffering on with a 60Hz screen, the frame you are seeing was generated 33ms ago, there is one in the buffer made 16ms ago, and one being worked on. That means if something happens in game, it won't appear on your screen for 33ms. G-Sync solves that problem by always flipping to the new frame when it's ready. How this translates to real-world performance: if a gamer can react to something on screen in 120ms and two of these people are side by side, the one with G-Sync will be able to react in that 120ms, while the non-G-Sync user waits 33ms first and then gets to react, making the time 153ms... that's over 20% faster. If you don't think that's a big deal, then you obviously don't play any games where reaction times have any bearing on the outcome.


Low frame rates with motion blur are not the issue. Sending the same frame 5 times, then the next 10 times, then the next 1 time, then the next 25 times, and/or with half of one frame and half of another (I think), is. There is no reason this could not be handled in the card. The review and Nvidia even say it's not going to work for everything. The monitor is just displaying the same thing over and over until the onboard chipset driver tells it to change something. You have the max refresh/frequency of the light source, the max refresh of the liquid crystal display element, and then the chipset driver. The artifacts are caused by the messed-up frames from the card. It's telling the monitor's onboard chipset driver to wait vs. sending the complete frame again.

"G-Sync will be able to react in that 120ms, while the non-G-Sync user waits for 33ms first and then gets to react causing..." G-Sync does not magically make more frames. You have to wait for the next one to be ready somewhere.
I've played lots of games, and some have lots of mouse lag and some have very little or none that I can see. Some have lots of tearing and some have none that I can see. Some games with V-sync on are really bad; some look/feel great. These variances all exist without G-Sync.

If it were not proprietary, didn't add more hardware, fixed everything, and were free and open, it would sound good. This is not that. I just hope AMD or someone who knows what they are doing comes up with a V-Sync 2.0 that just works.
December 12, 2013 10:51:29 AM

Until they add it to 2560x1600 panels and 4k res panels, I don't care.

Anyone complaining about 'tearing/stuttering' while using 1080p is being idiotic.
December 12, 2013 10:55:25 AM

The 248 is the highest-selling gaming monitor on the planet - Nvidia is clearly going after that market, since their kits will be released soon as well. Nvidia is expecting people to shell out bucks to upgrade their existing monitor - although new ones are right around the corner.
December 12, 2013 11:02:31 AM

hysteria357 said:
Am I the only one who has never experienced screen tearing? Most of my games run past my refresh rate too....


Well the only explanation is that you don't know what screen tearing looks like or you've been playing every game with v-sync on.

ohim said:
I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot...


I don't think you understand what screen tearing actually is. The people who are affected by screen tearing are the ones whose frame rates surpass the monitor's refresh rate, which is normally 60Hz, and in this day and age that's almost everyone. Some people are not bothered by screen tearing at all, and some don't even know it exists, but for me it's annoying as hell. I do agree that Nvidia should make this a standard, but I don't know how their business works, and given that they've been profitable, I'm sure they considered all the options.
December 12, 2013 11:25:38 AM

Good to see that this actually works. Hopefully we will see a standardised version of this some day... If not, well, hopefully it won't be too expensive for Nvidia users! It would not be nice to see them ripped off.
In the best case this can be a really good invention; in the bad case, an Apple-like experience: good but expensive and closed.
December 12, 2013 11:26:15 AM

I wish AMD and Intel would work together on presenting an alternative to G-Sync; only with broad opposition can we murder the vile vendor lock-in here. Hopefully such a standard will be adopted by Nvidia in the end, and only then will we get broad adoption.
December 12, 2013 11:43:04 AM

The three main complaints:
- hardware locked (only nvidia GPUs)
- more expensive hardware (monitors)
- focused on image quality & user experience improvements rather than performance improvements

These are the same cons that Apple products have. However, the reason that makes people buy a product is the user experience. If the hardware is stronger (more fps) but the experience (image & video quality) is crappy (tearing/lag/stuttering/ghosting), then there is no reason to pay for "crappy" high technology if there is an alternative. And considering that a monitor is meant to be something you have for quite a few years (just like an Apple product), I think it's a very smart move from Nvidia and in the right direction. What's missing is something like a "Steve Jobs" marketing campaign and then "BOOM", a new era for monitor technologies.

But we also need competition in order not to be raped by nvidia. So AMD needs to invent something similar and soon. AMD, if you are listening... HELP!!!
December 12, 2013 11:59:59 AM

INick said:

But we also need competition in order not to be raped by nvidia. So AMD needs to invent something similar and soon. AMD, if you are listening... HELP!!!


I think that Intel is in a better position to offer an alternative to this, just because it is so much bigger. The good point for AMD is that they have been a little bit more keen on using OpenCL and those kinds of tools.
A big part is also the people who decide the DisplayPort and HDMI standards. One big solution is so much better for the market, but we also need pioneers, and Nvidia is in that position now!
Who knows... maybe G-Sync will be killer stuff for Nvidia for the next 2-3 years, and after that it is possible that G-Sync dies under the pressure of a more open system that does the same thing.
December 12, 2013 12:16:53 PM

I find it very pleasing that a new tech like G-SYNC has the advantages of peak performance and less input lag without the tearing, but there are some clear disadvantages with its implementation too, like requiring DisplayPort 1.2 and the issues with surround configurations. Also, it's nowhere mentioned, but I would like to ask: will the G-SYNC module work with software media players so that we can see 24fps movies at 24Hz? Current 60Hz monitors convert the 24fps into 60Hz internally. If Nvidia can make this happen, I'm expecting a slow but gradual revolution.
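For the 24fps-at-24Hz point, the mismatch is easy to put numbers on (simple arithmetic only; whether the G-SYNC module would actually be driven by video players this way is exactly the open question):

```python
# Simple arithmetic behind the 24 fps question: on a fixed 60 Hz panel, 24 fps
# video needs 3:2 pulldown, so frames alternate between ~50 ms and ~33 ms on
# screen (judder). At a native 24 Hz (or any whole multiple) every frame gets
# the same ~41.7 ms. This says nothing about what the G-SYNC module supports.

FILM_FPS = 24
PANEL_HZ = 60

print("refreshes per film frame:", PANEL_HZ / FILM_FPS)   # 2.5, not a whole number
cadence = [3, 2] * 3                                      # classic 3:2 pulldown
pulldown_ms = [r * 1000 / PANEL_HZ for r in cadence]
print("60 Hz pulldown frame times (ms):", [round(t, 1) for t in pulldown_ms])
print("24 Hz native frame time (ms):", round(1000 / FILM_FPS, 1))
```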
December 12, 2013 12:51:35 PM

I keep throwing money at the screen, but nothing is happening.

December 12, 2013 1:22:31 PM

wurkfur: "I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable and even when I went to my friends house where he had a 120hz monitor with SLI, I couldn't hardly see much difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider. "

I agree with this, as I'm still growing accustomed to my Asus 27" 1440p monitor. BUT, aren't they making a separate G-Sync module to buy and use with legacy monitors that won't have it pre-installed? I thought NVIDIA stated that at one point. That being the case, if benchmarks down the road show the same gain, or close to it, as an integrated module built into the monitor, I'd be more than content with purchasing that as well.
December 12, 2013 1:33:02 PM

"Until now, your choices were V-sync on or V-sync off, each decision accompanied by compromises that detracted from the experience."

There is also the option to cap frame rate.
December 12, 2013 1:41:52 PM

Someone Somewhere said:
Needs to be open. If it's not in a standard, it won't take off. How many people want to rip open a display (likely voiding the warranty) to install a pricey add-on card, and then be strongly limited in what other equipment they can use it with?

Get it standardised and into the DVI/HDMI/DP specs, then it'll take off.

I wonder if you could just add a flag for variable vertical blanks, and have it send a 'starting next frame' sequence whenever a frame is rendered.

If it's not included by default in monitors, it'll become the next PhysX. And to do that it has to be platform-agnostic.


IIRC, in the original article about G-Sync there was mention that if a monitor was retrofitted, it would also get a year's warranty.
December 12, 2013 1:58:59 PM

WHY THE HELL is it cheaper to buy one of these screens in America and ship it around the world to the Netherlands than to buy one here?!
Importing:
$250 (€180) for the screen, $108 (€80) for shipping, so a total of $358, or €260.
Buying here:
€295 ($405!!!) for the screen, €5 ($6.50) for shipping, so a total of €300, or $412.
How is this possible?!