AMD has been making lots of noise about Project FreeSync these past few months, but has also left plenty of questions unanswered. We’ve all been curious about how the industry is responding to AMD's FreeSync efforts, so we asked. It’s no surprise that AMD is confident when it comes to predicting the success of those efforts, especially given its purported cooperation with the major scaler vendors, but if the company’s optimism bears any semblance to reality, we’ll see widespread support for Adaptive-Sync in monitors with DisplayPort interfaces in the not-so-distant future.
As a refresher for the uninitiated, Project FreeSync is AMD's effort to drive mainstream adoption of the VESA Adaptive-Sync specification, an optional part of DisplayPort 1.2a. With it, a graphics card can work with a supporting monitor to eliminate tearing and stuttering in games. Before Adaptive-Sync, you had to live with these artifacts, or, if you enabled V-Sync, you could eliminate the tearing but would suffer even greater stuttering.
Nvidia has a similar, proprietary technology called G-Sync, which we've tested and found to work as advertised.
Another quick refresher: The Adaptive-Sync protocol works with variable vertical-blank intervals. After a frame is pushed to the display, the graphics card sends out a v-blank start signal, telling the monitor to keep the current image on-screen. Once the graphics card finishes rendering the next frame, it sends out a v-blank end signal, telling the monitor to scan for information again, at which point the monitor receives the new frame, updates the image, and the process repeats.
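The cycle described above can be sketched as a simple simulation. To be clear, the function and signal names below are ours for illustration; they are not actual DisplayPort protocol messages:

```python
# Illustrative sketch of the variable v-blank cycle; names are hypothetical,
# not part of the VESA Adaptive-Sync specification.

def refresh_cycle(frames):
    """Yield the signal sequence the graphics card sends for each frame."""
    for frame in frames:
        yield ("v_blank_start", frame)   # monitor holds the current image
        # ...GPU renders the next frame; this interval varies per frame...
        yield ("v_blank_end", frame)     # next frame ready: resume scanning

signals = list(refresh_cycle(["frame_0", "frame_1"]))
# Each frame produces one start/end pair, so two frames yield four signals.
```

The point of the sketch is that the pacing is driven entirely by the graphics card: the monitor simply holds the image until told otherwise, so the refresh interval stretches to match however long the next frame takes to render.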
The benefits: primarily a smoother experience for the user, with less perceived stuttering and uneven frame playback. The technology is primarily for gamers, but could be used to avoid inverse telecine artifacts for 24 FPS video. Adaptive-Sync comes with almost no latency; sending out these signals hardly costs any power at all; and there is effectively no performance overhead.
What’s more, this can be done with any monitor that supports Adaptive-Sync (in other words, the monitor uses a supporting scaler), since the graphics card tells the monitor what to do. No two-way handshake is needed for every frame when using Adaptive-Sync because the graphics card and the monitor agree upon the minimum and maximum frame intervals upon connecting. Of course, this does require a DisplayPort link.
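That one-time agreement can be sketched as follows. The numbers and function names are illustrative assumptions on our part (in practice, a monitor's supported range comes from its EDID/DisplayID data), but they show why no per-frame handshake is needed:

```python
# Sketch of the connect-time min/max interval agreement described above.
# Names and values are illustrative, not from the DisplayPort specification.

def negotiate(gpu_range_ms, monitor_range_ms):
    """Intersect the GPU's and monitor's supported frame-interval ranges."""
    lo = max(gpu_range_ms[0], monitor_range_ms[0])
    hi = min(gpu_range_ms[1], monitor_range_ms[1])
    return (lo, hi)

def hold_time(render_ms, agreed):
    """Per frame: no handshake, just clamp to the range agreed at connect."""
    lo, hi = agreed
    return min(max(render_ms, lo), hi)

# A hypothetical panel refreshing between roughly 30 and 144 Hz
# (about 33 ms down to 7 ms per frame):
agreed = negotiate((1, 1000), (7, 33))
print(hold_time(50, agreed))  # 33: a slow frame is capped at the panel's max
print(hold_time(12, agreed))  # 12: within range, shown as soon as it's ready
```

Frames that render faster than the panel's minimum interval get held to that minimum, and frames slower than the maximum force a repeat of the previous image; everything in between is displayed exactly when it is ready.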
There are three or four major vendors that build scalers for most of the leading monitor vendors, according to an AMD spokesman. AMD is actively working with these scaler vendors to bring Adaptive-Sync support into their higher-end scalers, and the company expects mainstream scalers to gain support for the feature very soon as well. The process isn’t progressing as slowly as we thought, although the AMD spokesman did make it clear that it was still too early to discuss exact commitments. A number of monitor vendors are reaching out to AMD for help implementing the technology, while others aren’t entirely convinced yet, the spokesman said.
It’s reasonable to expect this functionality to show up in non-gaming-oriented monitors, like high-resolution IPS monitors. In fact, the monitor we saw running Adaptive-Sync at Computex 2014 was an IPS monitor with a resolution of 2560 x 1440 pixels (we came by this information on our own). Although AMD won't reveal the manufacturer, that monitor can be purchased in retail stores today, albeit with firmware that lacks Adaptive-Sync support. However, don’t expect monitor vendors to start updating the firmware of existing products.
Nevertheless, this is good news. We can expect to see almost all mainstream and high-end monitors support Adaptive-Sync in the future. Last month, during Computex, AMD said that we should see Adaptive-Sync monitors on the market within six to 12 months. If so, Adaptive-Sync could have a leg up on G-Sync, at least in terms of speed to market (we’ll let our testing speak to technological superiority when the time is right). G-Sync also operates over DisplayPort, but with a proprietary protocol, which will probably limit its adoption.
AMD is unlikely to support similar technology over HDMI, according to the spokesman, who said: "If we wanted to do something over HDMI right now, it would have to be proprietary, and we would rather not do that." Perhaps a not-so-subtle knock at Nvidia.