AMD has been making lots of noise about Project FreeSync these past few months, but has also left plenty of questions unanswered. We’ve all been curious about how the industry is responding to AMD's FreeSync efforts, so we asked. It’s no surprise that AMD is confident when it comes to predicting the success of those efforts, especially based on its purported cooperation with major scaler players, but if the company’s optimism is met with any semblance of reality, we’ll see widespread support for Adaptive-Sync in monitors with DisplayPort interfaces in the not-so-distant future.
As a refresher for the uninitiated, Project FreeSync is AMD's effort to drive mainstream adoption of the VESA Adaptive-Sync specification, an optional feature of DisplayPort 1.2a. Using it, a graphics card can work with supporting monitors to eliminate tearing and stuttering in games. Before Adaptive-Sync, you had to live with these artifacts, or, if you enabled V-Sync, you could eliminate the tearing but would suffer even greater stuttering.
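To make that tradeoff concrete, here is a minimal Python sketch of how long a finished frame stays on-screen in each mode. The 60 Hz panel and the Adaptive-Sync refresh range are illustrative assumptions, not figures from the spec:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # assumed fixed 60 Hz panel: ~16.7 ms per refresh

def vsync_display_time(render_ms):
    """With V-Sync on a fixed-refresh panel, a frame that misses a refresh
    must wait for the next one, so its on-screen time rounds up to a whole
    number of refresh intervals (that wait is the extra stutter)."""
    intervals = max(1, math.ceil(render_ms / REFRESH_INTERVAL_MS))
    return intervals * REFRESH_INTERVAL_MS

def adaptive_sync_display_time(render_ms, min_ms=6.9, max_ms=33.3):
    """With Adaptive-Sync, the panel refreshes when the frame is ready,
    clamped to the monitor's supported range (values here are made up)."""
    return min(max(render_ms, min_ms), max_ms)

# A 20 ms frame (50 FPS) stalls for two full refresh intervals under V-Sync,
# but displays for exactly its render time with Adaptive-Sync:
print(round(vsync_display_time(20), 1))     # 33.3
print(adaptive_sync_display_time(20))       # 20
```

The V-Sync penalty is worst just above a refresh boundary: a frame that takes 17 ms on a 60 Hz panel is held for a full 33.3 ms, dropping perceived smoothness to 30 FPS.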
Nvidia has a similar, proprietary technology that it calls G-Sync, which we've tested and found to work as promised.
Another quick refresher: The Adaptive-Sync protocol works with variable vertical-blank intervals. After a frame is pushed to the display, the graphics card sends a v-blank start signal, telling the monitor to keep the current image on-screen. As soon as the graphics card finishes rendering the next frame, it sends a v-blank end signal, telling the monitor to begin scanning again; the monitor receives the new frame, updates the image, and the process repeats.
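That signaling sequence can be sketched as a toy model; the class and method names here are illustrative, not part of the VESA specification:

```python
class AdaptiveSyncMonitor:
    """Toy model of an Adaptive-Sync panel: it holds the current image
    through an extended vertical blank and scans out a new frame as soon
    as the GPU ends the blank."""

    def __init__(self):
        self.displayed_frame = None
        self.in_vblank = False

    def vblank_start(self):
        # GPU: "keep the current image on-screen; no new data yet."
        self.in_vblank = True

    def vblank_end(self, new_frame):
        # GPU: "frame ready; start scanning again." The panel updates at once.
        self.in_vblank = False
        self.displayed_frame = new_frame

def render_loop(monitor, frames):
    """One cycle per frame: start the blank, render (taking however long
    it takes), then end the blank with the finished frame."""
    for frame in frames:
        monitor.vblank_start()   # hold the previous image while rendering
        # ... GPU renders `frame` here; render time may vary per frame ...
        monitor.vblank_end(frame)

mon = AdaptiveSyncMonitor()
render_loop(mon, ["frame 0", "frame 1", "frame 2"])
print(mon.displayed_frame)  # frame 2
```

Because the blank simply stretches until the next frame is ready, the panel's refresh naturally follows the GPU's variable frame times instead of a fixed clock.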
The benefits: primarily a smoother experience for the user, with less perceived stuttering and more even frame playback. The technology is aimed primarily at gamers, but it could also be used to avoid the judder that 3:2 pulldown introduces in 24 FPS video. Adaptive-Sync adds almost no latency, sending these signals costs hardly any power, and there is effectively no performance overhead.
What’s more, this can be done with any monitor that supports Adaptive-Sync (in other words, any monitor that uses a supporting scaler), since the graphics card tells the monitor what to do. No two-way handshake is needed for every frame, because the graphics card and the monitor agree upon the minimum and maximum refresh intervals when they connect. Of course, this does require a DisplayPort link.
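That one-time agreement can be sketched like this; the refresh-rate figures and function names are illustrative assumptions, not values from any real driver:

```python
def negotiate_range(gpu_hz_range, panel_hz_range):
    """Done once at connection time: intersect the refresh-rate range the
    GPU can drive with the range the panel reports it supports."""
    lo = max(gpu_hz_range[0], panel_hz_range[0])
    hi = min(gpu_hz_range[1], panel_hz_range[1])
    return (lo, hi)

def effective_refresh_hz(frame_time_ms, hz_range):
    """Per frame, the refresh rate simply tracks the frame rate, clamped
    into the negotiated range; no per-frame handshake is required."""
    lo, hi = hz_range
    return min(max(1000 / frame_time_ms, lo), hi)

rng = negotiate_range((24, 144), (40, 75))  # hypothetical ranges -> (40, 75)
print(effective_refresh_hz(20, rng))  # a 50 FPS frame drives a 50 Hz refresh
print(effective_refresh_hz(8, rng))   # a 125 FPS frame is clamped to 75 Hz
```

Once the range is fixed, each frame only needs the one-way v-blank signaling described above, which is why the scheme adds essentially no protocol overhead.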
There are three or four major vendors that build scalers for most of the leading monitor vendors, according to an AMD spokesman. AMD is actively working with these scaler vendors to bring Adaptive-Sync support into their higher-end scalers, and the company expects mainstream scalers to gain support for the feature very soon as well. The process isn't progressing as slowly as we thought, although the AMD spokesman did make it clear that it was still too early to discuss exact commitments. A number of monitor vendors are reaching out to AMD for help implementing the technology, while others aren't entirely convinced yet, the spokesman said.
It’s reasonable to expect this functionality to show up in non-gaming-oriented monitors, like high-resolution IPS monitors. In fact, the monitor we saw running Adaptive-Sync at Computex 2014 was an IPS monitor with a resolution of 2560 x 1440 pixels (we came by this information on our own). Although AMD won't reveal the manufacturer, that monitor can be purchased today in retail stores; the demo unit simply ran different firmware that enables Adaptive-Sync. However, don’t expect monitor vendors to start updating the firmware of existing products.
Nevertheless, this is good news. We can expect to see almost all mainstream and high-end monitors support Adaptive-Sync in the future. Last month, during Computex, AMD announced that we should see Adaptive-Sync monitors on the market within 6-12 months. If so, Adaptive-Sync could have a leg up on G-Sync, at least in terms of speed to market (we’ll let our testing speak to technological superiority when the time is right). G-Sync also operates over DisplayPort, but with a proprietary protocol, which will probably limit its support.
AMD is unlikely to support similar technology over HDMI, according to the spokesman, who said: "If we wanted to do something over HDMI right now, it would have to be proprietary, and we would rather not do that." Perhaps a not-so-subtle knock at Nvidia.
Niels Broekhuijsen is a Contributing Writer for Tom's Hardware US. He reviews cases, water cooling and PC builds.
Although ASync only works with 2 AMD cards at the moment. Not sure how that makes it more open to the public. I am sure AMD will try and expand the card list to support ASync in the future, but for now it is even more limited than Nvidia's solution.
I wouldn't call a limited list of current cards an issue, considering the 6-12 month wait before monitors that support it arrive, and given that most people don't replace their monitors very often. I think by the time these monitors are commonplace, AMD's full line of GPUs will likely support it.
amd late to the game again...
The best thing would be to have Intel support this feature with their next CPU/GPU upgrades. It should be possible to do it just by making new drivers, and those CPU/GPU solutions really need this feature, because they can have really poor frame rates.
I am certain AMD wants this to work on AMD, Nvidia, and Intel GPUs. Most proprietary solutions fail in the marketplace unless the company shells out cash for manufacturers/developers to adopt them. After all, there's no point limiting your market to 50% of users. More than likely we will see support for AMD's current architecture and Nvidia since Kepler.
No special hardware needed, no extra cost on the monitor... If this works as expected, who would ever spend extra for G-Sync and then be pigeon-holed into only using Nvidia's GPUs?
Although ASync only works with 2 AMD cards at the moment. Not sure how that makes it more open to the public. I am sure AMD will try and expand the card list to support Async in the future, but for now it is even more limited than Nvidia's solution.
The R9 290X, R9 290, R7 260X and R7 260 graphics cards support it. Their APUs also support it. A lot of people will already be able to benefit when it comes out.
Of course it is limited, since it's not out yet. When it is out, though, it's over for G-Sync. If Nvidia doesn't support this tech because of their stupid EGO, or tries to fight it, they will suffer, if reviews show it's essential across a wide range of gaming levels (low to high to ultra high).
The best thing would be to have Intel support this feature with their next CPU/GPU upgrades. It should be possible to do it just by making new drivers, and those CPU/GPU solutions really need this feature, because they can have really poor frame rates.
Yeah, and Intel has so far been really good about trying to update features on their graphics chips. Granted, they only have a small handful that are worth using for anything, which makes updating easier compared to the massive catalog of cards AMD and Nvidia have to support. So I bet Intel would be happy about the technology and support it. It would be good to see them get into it.
i love amd, but if it wasn't for Nvidia's money-mongering, this would have never come to fruition
i love amd, but if it wasn't for Nvidia's money-mongering, this would have never come to fruition
Not necessarily. This issue has been present for pretty much the full history of computer gaming; that is why V-Sync was originally created. It seems more likely that both companies were working on the same problem and came up with different solutions. Nvidia created a solution that would make them more money; AMD created a solution that would improve their performance. It probably helps that having this will prevent lost sales from people wanting to use G-Sync, but I doubt that was their primary reason.