3D LightBoost, On-Board Memory, Standards, And 4K
As we were going through Nvidia's press material, we found ourselves asking a number of questions about G-Sync as a technology today, along with its role in the future. During a recent trip to the company's headquarters in Santa Clara, we were able to get some answers.
G-Sync And 3D LightBoost
The first thing we noticed was that Nvidia chose the Asus VG248QE as the monitor it's sending out, modified to support G-Sync. That monitor also supports what Nvidia currently calls 3D LightBoost, a technology originally introduced to improve brightness in 3D displays, but one that has long been used unofficially in 2D mode as well, pulsing the backlight to reduce the ghosting (or motion blur) we mentioned on page one. Naturally, we wanted to know whether it could be used alongside G-Sync.
Nvidia's answer was no. Although using both technologies at the same time would be ideal, strobing the backlight at a variable refresh rate currently results in flicker and brightness problems. Solving them is incredibly complex, since luminance has to be adjusted and every pulse tracked. As a result, you currently have to choose between the two technologies, although the company is working on a way to use them together in the future.
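To see why, consider that perceived brightness on a strobed backlight tracks the duty cycle: the fraction of each refresh interval the backlight is actually lit. A fixed-width pulse therefore dims as the interval between refreshes grows. Here's a minimal sketch (our illustration, not Nvidia's method, with an assumed target duty cycle) of how much the pulse would have to stretch at lower frame rates just to hold brightness steady:

```python
# Illustrative sketch only, not Nvidia's implementation. Perceived
# brightness roughly follows the backlight duty cycle, so keeping it
# constant as the refresh interval varies means scaling the strobe
# pulse width with the frame interval.

TARGET_DUTY_CYCLE = 0.25  # assumed: fraction of each frame the backlight is lit

def strobe_pulse_ms(frame_interval_ms: float) -> float:
    """Pulse width needed to hold perceived brightness constant."""
    return TARGET_DUTY_CYCLE * frame_interval_ms

for fps in (144, 85, 60, 40, 30):
    interval_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {interval_ms:6.2f} ms frame, "
          f"{strobe_pulse_ms(interval_ms):5.2f} ms pulse")
```

Even this naive compensation assumes the pulse can stretch without affecting motion clarity, and a real display would have to recompute it every frame as intervals jump around, which hints at why Nvidia calls the problem complex.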
The G-Sync Module's On-Board Memory
As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there's no longer a need to wait for the next scheduled panel refresh. However, we noticed that the G-Sync module has on-board memory. Could the module be buffering frames itself? And if so, how long would it take a frame to make its way through the new pipeline?
According to Nvidia, frames are never buffered in the module's memory. As data comes in, it's displayed on the screen; the memory serves several other roles. The processing time G-Sync adds is well under one millisecond. In fact, total latency ends up roughly the same as with V-sync off, dominated instead by the game, graphics driver, mouse, and so on.
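To make the distinction concrete, here's a hypothetical sketch contrasting the pass-through behavior Nvidia describes with a design that buffers whole frames; scanlines_in, panel_write, and lines_per_frame are made-up stand-ins, not the module's actual interfaces:

```python
# Hypothetical illustration only. A pass-through design forwards each
# scanline to the panel as it arrives, adding per-line latency at most;
# a buffered design waits for a complete frame first, adding roughly
# one full frame of delay.

def passthrough(scanlines_in, panel_write):
    """Forward each scanline the moment it arrives."""
    for line in scanlines_in:
        panel_write(line)

def buffered(scanlines_in, panel_write, lines_per_frame):
    """Accumulate a full frame before scanning it out."""
    frame = []
    for line in scanlines_in:
        frame.append(line)
        if len(frame) == lines_per_frame:
            for buffered_line in frame:
                panel_write(buffered_line)
            frame.clear()

# Demo: four fake scanlines go straight through with no frame delay.
passthrough((f"scanline {i}" for i in range(4)), print)
```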
Will G-Sync Ever Be Standardized?
This came up during a recent AMA with AMD, when a reader wanted the company's reaction to G-Sync. However, we also wanted to follow up with Nvidia directly to see whether it has any plans to push the technology as an industry standard. In theory, it could propose G-Sync as an update to the DisplayPort standard, exposing variable refresh rates. Nvidia is a member of VESA, the industry's main standards body, after all.
Put simply, there are no plans to introduce a new spec to DisplayPort, HDMI, or DVI. G-Sync already works over DisplayPort 1.2, so there's no need for a standards change.
As mentioned, Nvidia is working to make G-Sync compatible with what it currently refers to as 3D LightBoost (the feature will be renamed soon). It's also trying to bring the module's cost down to make G-Sync more accessible.
G-Sync At Ultra HD Resolutions
Nvidia's online FAQ promises G-Sync-capable monitors with resolutions as high as 3840x2160. However, the Asus model we're previewing today tops out at 1920x1080. Currently, the only Ultra HD monitors available employ STMicro's Athena controller, which uses two scalers to drive the screen as a tiled display. We were curious, then: does the G-Sync module support a Multi-Stream Transport (MST) configuration?
In truth, we'll be waiting a while for 4K displays with variable refresh rates. Today, no single scaler can handle 4K resolutions; the first ones are expected in Q1 2014, with monitors built around them arriving in Q2. Because the G-Sync module replaces the scaler, compatible panels would only start surfacing sometime after that point. Fortunately, the module does natively support Ultra HD.
Keeping Up: What Happens Under 30 Hz?
The variable refresh enabled by G-Sync works down to 30 Hz. The reason is that, at very low refresh rates, the image on an LCD panel starts to decay, leaving visual artifacts. If your source drops below 30 FPS, the module refreshes the panel automatically, which can mean displaying the same image more than once. The threshold is set at 30 Hz to keep the picture looking its best.
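The behavior Nvidia describes boils down to a simple rule: wait for the next frame, but never longer than the panel's minimum refresh interval; if time runs out, redraw the last frame. Here's a hypothetical sketch of that logic, with made-up wait_for_frame and scan_out callbacks standing in for the module's internals:

```python
# Hypothetical sketch of the sub-30 Hz behavior described above; not
# Nvidia's firmware. If no new frame arrives within the panel's minimum
# refresh interval, the previous frame is redrawn so the LCD image
# doesn't have time to decay.

MIN_REFRESH_HZ = 30.0
MAX_WAIT_S = 1.0 / MIN_REFRESH_HZ  # ~33.3 ms

def refresh_loop(wait_for_frame, scan_out):
    """wait_for_frame(timeout) returns a new frame or None on timeout;
    scan_out(frame) drives the panel."""
    last_frame = None
    while True:
        frame = wait_for_frame(timeout=MAX_WAIT_S)
        if frame is not None:
            last_frame = frame  # a new frame arrived in time: show it
        # on timeout, fall through and redraw the previous frame instead
        if last_frame is not None:
            scan_out(last_frame)
```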