VESA's Adaptive-Sync Certification Could Kill FreeSync, G-Sync Branding

Alienware AW3423DW
(Image credit: Dell)

When choosing among the best gaming monitors, it can be challenging to keep up with all the features that contribute to performance. In addition to picking a display's size and resolution, you must keep track of Adaptive-Sync compatibility (Nvidia G-Sync versus AMD FreeSync) and the refresh rates a given monitor supports. Response times are another critical spec, and one that display OEMs often fudge to make their products seem superior to the competition.

To help simplify the research and purchase process and provide more transparency for monitor buyers, the Video Electronics Standards Association (VESA) has developed the Adaptive-Sync Display Compliance Test Specification (Adaptive-Sync Display CTS), which it claims is the first open standard and logo program for desktop and laptop monitors. Compliance boils down to two logo tiers: MediaSync Display and Adaptive-Sync Display.

"The MediaSync Display logo performance tier is designed to ensure that displays meet a high level of quality optimized for media playback," states VESA. "This logo performance tier eliminates video frame dropping, and 3:2 pull-down jitter and other sources of jitter, while meeting its mandatory flicker performance level to make the display visually flicker free.

"The AdaptiveSync logo performance tier is optimized for gaming and designed for displays that have a sufficiently large variable video frame-rate range and low latency, while also supporting high-quality media playback with a similar set of benefits as the MediaSync Display logo performance tier."

VESA Adaptive-Sync Display CTS

(Image credit: VESA)

Before we hop into the specifics of Adaptive-Sync Display CTS, let's first look at what Adaptive-Sync itself aims to accomplish. Adaptive-Sync (whether in G-Sync or FreeSync form) synchronizes a monitor's refresh cycle with that of the connected graphics card.

The graphics card "sets the pace," so to speak, by continuously controlling the refresh rate and keeping the monitor in step. With this mechanism in place, the monitor fully draws each frame before the graphics card sends a new one. As a result, the screen tearing and other artifacts that occur when the monitor and GPU fall out of sync are eliminated.
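
To make that concrete, here's a minimal toy model of VRR timing in Python. It is purely illustrative, not VESA's, Nvidia's, or AMD's actual algorithm, and the 48-144 Hz window and frame rates are hypothetical:

```python
# Toy model of variable-refresh-rate (VRR) timing -- illustrative only.
MIN_HZ, MAX_HZ = 48, 144          # hypothetical VRR window for this panel
MAX_INTERVAL_MS = 1000 / MIN_HZ   # longest the panel can hold a frame (~20.8 ms)
MIN_INTERVAL_MS = 1000 / MAX_HZ   # shortest interval the panel supports (~6.9 ms)

def refresh_interval(gpu_frame_ms: float) -> float:
    """Interval at which the panel refreshes for a given GPU frame time.

    Inside the VRR window the panel simply follows the GPU, so each frame
    is fully drawn before the next arrives (no tearing). Outside the
    window we clamp for simplicity; real monitors typically repeat frames
    (low-framerate compensation) below the floor instead.
    """
    return min(max(gpu_frame_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for fps in (40, 60, 100, 200):    # hypothetical game frame rates
    ms = refresh_interval(1000 / fps)
    print(f"{fps:>3} fps -> panel refreshes every {ms:5.1f} ms ({1000 / ms:.0f} Hz)")
```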

VESA's Adaptive-Sync Display CTS comprises more than 50 automated display performance tests that benchmark refresh rate, flicker, gray-to-gray response times, frame-rate jitter, and frame drops, among other variables. In addition, all monitors seeking Adaptive-Sync Display or MediaSync Display logo certification are tested in their default configuration as shipped from the factory, which levels the playing field.

To obtain the Adaptive-Sync Display logo, a monitor's Adaptive-Sync range must extend from no higher than 60 Hz at the bottom to at least 144 Hz at the top. The MediaSync Display logo, on the other hand, requires a range covering at least 48 Hz to 60 Hz. For the Adaptive-Sync Display certification, a performance tier number denotes the maximum refresh rate a monitor can sustain. For example, you might see monitors labeled Adaptive-Sync Display 240 or Adaptive-Sync Display 360 to denote 240 Hz and 360 Hz maximums, respectively.
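
Distilling those ranges into code, a sketch of the qualification logic might look like this (the function names and example monitors are hypothetical, and VESA's real tests check far more than the range):

```python
def meets_adaptive_sync_display(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    # Per the figures above: the range must reach down to 60 Hz or lower
    # and up to 144 Hz or higher.
    return vrr_min_hz <= 60 and vrr_max_hz >= 144

def meets_media_sync_display(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    # Per the figures above: the range must cover at least 48 to 60 Hz.
    return vrr_min_hz <= 48 and vrr_max_hz >= 60

# Hypothetical monitors:
print(meets_adaptive_sync_display(48, 144))  # True  -> "Adaptive-Sync Display 144"
print(meets_adaptive_sync_display(60, 120))  # False -> 120 Hz ceiling is too low
print(meets_media_sync_display(48, 75))      # True  -> "MediaSync Display"
```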

VESA also specs less than 1 ms of jitter across ten common video frame rates: 23.976, 24, 25, 29.97, 30, 47.952, 48, 50, 59.94 and 60 Hz. Regarding response times, the gaming-centric Adaptive-Sync Display tier is spec'd at 5 ms or less (gray-to-gray). That 5 ms figure is averaged over 20 tests, so display manufacturers can't cherry-pick the performance metrics that fit their narratives.
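
As a back-of-the-envelope illustration of those two limits (all measurement values below are invented, and VESA's actual methodology is more involved):

```python
# Illustrative check of the <1 ms jitter and <=5 ms average G2G limits.
NOMINAL_FPS = 23.976                    # one of the ten listed video rates
nominal_ms = 1000 / NOMINAL_FPS         # ~41.7 ms between frames

# Hypothetical measured frame-to-frame display intervals (ms):
intervals = [41.7, 41.9, 41.5, 41.8, 41.6]
jitter_ok = all(abs(t - nominal_ms) < 1.0 for t in intervals)

# Hypothetical gray-to-gray results from the 20 transition tests (ms).
# Because the spec averages all 20, a vendor can't quote only its
# single fastest transition.
g2g_ms = [4.2, 5.8, 3.9, 4.5] * 5       # 20 values
avg_g2g = sum(g2g_ms) / len(g2g_ms)

print(f"jitter within 1 ms: {jitter_ok}")
print(f"average G2G: {avg_g2g:.2f} ms -> {'pass' if avg_g2g <= 5.0 else 'fail'}")
```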

In addition to the response time limits, VESA attempts to crack down on factory overdrive settings, which can often introduce unsightly visual artifacts when gaming. VESA's testing requires that Adaptive-Sync Display and MediaSync Display monitors stay under 20 percent overshoot and under 15 percent undershoot across 16 tests. As a result, manufacturers that have traditionally shipped their monitors with overly aggressive overdrive will likely have to dial things back to meet VESA Adaptive-Sync Display CTS compliance.
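
For reference, overshoot is commonly quantified as the amount a transition's luminance briefly exceeds its target, relative to the size of the commanded transition. The sketch below uses that common definition with made-up numbers; VESA's exact formula may differ:

```python
def overshoot_pct(start: float, target: float, peak: float) -> float:
    """Overshoot as a percentage of the commanded transition size."""
    return 100 * (peak - target) / (target - start)

# Hypothetical rise from gray level 50 to 200 on an aggressively
# overdriven panel that briefly spikes to 236 before settling:
o = overshoot_pct(start=50, target=200, peak=236)
print(f"overshoot: {o:.0f}% -> {'pass' if o < 20 else 'fail'} against the <20% limit")
```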

And because the ambient temperature during testing can significantly impact display behavior, VESA specifies that testing is performed at a room temperature between 72.5 and 76 degrees Fahrenheit (roughly 22.5 to 24.5 degrees Celsius). You can view all of VESA's requirements here.

"The Adaptive-Sync Display CTS builds upon the foundation that VESA laid with the introduction of the Adaptive-Sync protocols eight years ago," said Roland Wooster, chairman of the VESA Display Performance Metrics Task Group. "It provides an open, industry-wide and brand-agnostic standard backed by a logo program that gives consumers a guarantee that the displays that they're buying for gaming or for media playback will meet a clearly defined minimum set of front-of-screen performance criteria when used with a suitable GPU. In designing the test specification and logo program, VESA explicitly set a high bar on performance criteria and testing methodology with tighter criteria than many existing specs and logo programs."

So, what does this mean for you, the consumer? In the near term, you'll likely see monitors that feature a G-Sync or FreeSync logo in addition to an Adaptive-Sync Display or MediaSync Display logo. However, with the stringent set of requirements needed to achieve these new certifications, we could reach a point where G-Sync and FreeSync branding is abandoned entirely in favor of Adaptive-Sync Display or MediaSync Display. Nvidia's and AMD's specifications are already largely interchangeable, so consolidating on one standard should make things more transparent for consumers.

And that's the whole point of Adaptive-Sync Display CTS and its new logo certifications: greater transparency, backed by a stringent set of tests, is good news for the industry in general. Unfortunately, since we're starting from the ground floor with Adaptive-Sync Display CTS, only two compliant monitors have been officially announced so far: the LG UltraGear 27GP950 and 27GP850. However, we should expect that number to grow significantly over the coming months and years.

Brandon Hill

Brandon Hill is a senior editor at Tom's Hardware. He has written about PC and Mac tech since the late 1990s with bylines at AnandTech, DailyTech, and Hot Hardware. When he is not consuming copious amounts of tech news, he can be found enjoying the NC mountains or the beach with his wife and two sons.

  • hotaru.hino
    If the testing results in VRR implementations that are as good as or better than either of those two, then sure. But considering how long G-Sync has survived and that it's still being included on higher-end monitors, I don't think it'll go away any time soon.

    Though honestly VESA should've done this from the start, or at least the branding of it.
  • King_V
    Agreed overall, but I do think this is, ultimately, going to push G-Sync aside.

    Why spend the extra money on the production side, or the extra money on the consumer's price side, for something that doesn't give any extra benefit?
  • tennis2
    Ever since Nvidia quit with the nonsense GSync modules (admittedly necessary to achieve the functionality for the ~1 year before significant market adoption of VRR, hence FreeSync), they've been using VESA adaptive sync, right? The whole "GSync Compatible" charade was just Nvidia's way of directing revenue to their most loyal monitor manufacturers, when in reality the vast majority of VESA adaptive sync monitors could work on Nvidia 10xx and higher GPUs over DP from the (Jan 2019) start.

    FreeSync was always VESA adaptive sync AFAIK, but with the additional legwork by AMD of including VRR over HDMI (which is now a non-issue for monitors that have HDMI 2.1).

    Honestly, it seems long overdue for everything to just fall under VESA adaptive sync.
  • InvalidError
    hotaru.hino said:
    Though honestly VESA should've done this from the start, or at least the branding of it.
    The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.
  • hotaru.hino
    tennis2 said:
    Ever since Nvidia quit with the nonsense GSync modules (admittedly necessary to achieve the functionality for the ~1 year before significant market adoption of VRR, hence FreeSync), they've been using VESA adaptive sync, right? The whole "GSync Compatible" charade was just Nvidia's way of directing revenue to their most loyal monitor manufacturers, when in reality the vast majority of VESA adaptive sync monitors could work on Nvidia 10xx and higher GPUs over DP from the (Jan 2019) start.
    G-Sync offers a few features that FreeSync doesn't provide, and BlurBusters has noticed that G-Sync tends to provide a better experience. Also, NVIDIA likely has a stringent qualification process because G-Sync is positioned as a premium feature. However, on the monitors themselves, I believe the G-Sync module is responsible for driving the display with regard to timings, pixel overdriving, and whatnot, and is not simply something there to facilitate commands from an NVIDIA GPU.

    Though on laptops it's different. It's actually using VESA standards, but NVIDIA is probably still making laptop makers go through a more stringent qualification before they can slap on the G-Sync badge.

    InvalidError said:
    The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.
    I subscribe to the "conspiracy" that AMD's marketers were hoping to bank on people's ignorance, riding on another group's work so people would think AMD invented it or something and released it royalty-free for PR points.
  • parkerthon
    King_V said:
    Agreed overall, but I do think this is, ultimately, going to push G-Sync aside.

    Why spend the extra money on the production side, or the extra money on the consumer's price side, for something that doesn't give any extra benefit?

    I think there's actually a big difference between G-Sync certified monitors and "compatible" monitors that are otherwise AMD certified, which is a fairly flexible and easy certification.

    I don't know if it's a G-Sync thing, but adaptive refresh on native G-Sync has been a superior experience for me versus monitors with the G-Sync Compatible version. In my experience, if a game has micro stutter or any other kind of hitching issue, which is not uncommon, having G-Sync enabled on a "compatible" monitor is a poor experience. I found that buying a G-Sync premium monitor made the issue go away, even if that's cost prohibitive for many. Apparently it has to do with the floor refresh rate being too high, since micro stutter, per my recollection of researching the issue at the time, causes your fps to drop very quickly before rebounding.

    That's my only concern about this certification: that we get an inferior certification (e.g., I feel the 48 Hz minimum is too high; it should be a 24 Hz minimum) which will quickly become a base standard and nothing more (similar to how HDR has evolved over the years into competing "premium" standards). I do see the potential for better validation of claimed performance specs from different display vendors; however, that doesn't help me much, as I never buy anything without seeing someone knowledgeable and geared up measure hard specs on a random sample unit first. So all in all, the standard is probably only an improvement for consumers shopping at the lower end of the gaming monitor segment, which is cool all the same. I just wish they offered a premium set of specs as well, and also that they refreshed the standard frequently.

    Meanwhile, application performance is the other side of this puzzle. I'd imagine that if this standard were, well, actually a widely adopted standard, developers would test more thoroughly for issues that impact performance on compatible displays. I could see far fewer sync-related issues in games, especially in indie games and new releases. So that's a pretty good upside as well.
  • -Fran-
    InvalidError said:
    The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.
    This is an interesting hot take on the events.

    Then why did nVidia create the "tiers" of GSync instead of just saying "VRR compatible" or something? AMD created FreeSync to market their own "standard measurement" for monitor quality around VRR. You can make any judgement you like about its quality, but nVidia did the exact same thing and went a step further with the module so monitors could pass all their criteria for their VRR implementation.

    Could AMD have pushed VESA to get this out faster? Maybe. Could nVidia have done it as well? Absolutely. Was it in either's business best interest? LOL, no, and you know it.

    Don't go giving that hot take without understanding how both AMD and nVidia didn't want to have VESA get this out too fast/soon, since they'd have to push for either a better certification (so it actually means squat), or just phase the branding out after investing in marketing for them.

    TL;DR: you pinning this on AMD is disingenuous, to say the least.

    Regards.
  • Sergei Tachenov
    What about variable overdrive?

    It's no secret that the best overdrive settings differ across refresh rates. This is especially true for high-refresh-rate monitors.

    Native G-Sync monitors adjust overdrive automatically, so I don't care whether my game runs at 60 FPS, 90 FPS or 200 FPS. With a G-Sync Compatible monitor, I'll either have to pick a compromise setting or change the setting manually. There are some G-Sync Compatible monitors that try to imitate variable OD, but they do it simply by automatically switching between OD presets, which leads to a very poor experience if a game's frame rate happens to constantly fluctuate from one preset range to another.

    That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?
  • hotaru.hino
    Sergei Tachenov said:
    That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?
    VRR isn't solely to prevent screen tearing. It can also make the system more efficient. Sure it might not matter for a desktop, but for a laptop, tablet, or phone, every watt counts.
  • InvalidError
    -Fran- said:
    TL;DR: you pinning this on AMD is disingenuous, to say the least.
    The only thing I'm pinning on AMD is the completely unnecessary FreeSync branding as the basic tier requirements add absolutely nothing to baseline Adaptive Sync besides AMD's marketing blessing.

    Nvidia's Gsync came to market over a year before Adaptive Sync became a standard, and it requires proprietary scaler hardware that adds ~$200 to the cost of monitors that will only ever work as intended with Nvidia GPUs.