AMD's FreeSync technology is gaining momentum, but is the buzz warranted? We did our own research, worked with Acer and spoke with AMD to find out.
Human ingenuity works in incredible but mysterious ways. We somehow managed to put a man on the moon (1969) before realizing that adding wheels to luggage was a good idea (Sadow's patent, 1970). In similar (although maybe not as spectacular) fashion, it took more than a decade after the introduction of PC LCD displays for people to realize that there really was no reason for them to operate using a fixed refresh rate. This first page is dedicated to answering why fixed refresh rates on LCDs are even a thing. First, we need to explain how contemporary video signaling works. Feel free to skip ahead if you're not interested in a bit of PC history.
Back in the '80s, cathode ray tubes (CRTs) used in TVs needed a fixed refresh rate because they physically had to move an electron gun pixel by pixel, then line by line and, once they reached the end of the screen, re-position the gun at the beginning. Varying the refresh rate on the fly was impractical, at best. All of the supporting technology standards that emerged in the '80s, '90s and early '00s revolved around that necessity.
The most notable standard involved in controlling signaling from graphics processing units (GPUs) to displays is VESA's Coordinated Video Timings ("CVT," and also its "Reduced Blanking" cousins, "CVT-R" and "CVT-R2"), which, in 2002-2003, displaced the analog-oriented Generalized Timing Formula that had been the standard since 1999. CVT became the de facto signaling standard for both the older DVI and newer DisplayPort interfaces.
Like its predecessor, the Generalized Timing Formula ("GTF"), CVT operates on a fixed "pixel clock" basis. The signal includes horizontal and vertical blanking intervals, as well as horizontal and vertical frequencies. The pixel clock itself (which, together with some other factors, determines the interface bandwidth) is negotiated once and cannot easily be varied on the fly. It can be changed, though doing so typically makes the GPU and display go out of sync. Think of what happens when you change your display's resolution in your OS, or if you've ever tried EVGA's "pixel clock overclocker."
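To make the fixed-pixel-clock relationship concrete, here is a minimal sketch of the arithmetic described above. The blanking figures are illustrative, roughly in line with reduced-blanking timings, and the function itself is our own demonstration, not part of any VESA tool:

```python
def pixel_clock_hz(h_active, h_blank, v_active, v_blank, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz

# Roughly CVT-R-like blanking for 1920x1080 at 60 Hz (illustrative values):
clk = pixel_clock_hz(1920, 160, 1080, 31, 60)
print(round(clk / 1e6, 1), "MHz")  # ~138.7 MHz
```

Because that clock is negotiated once, changing any of the inputs (resolution or refresh rate) forces a renegotiation, which is exactly why the GPU and display briefly fall out of sync when you do.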
Now, in the case of DisplayPort, the video stream attributes (together with other information used to regenerate the clock between the GPU and display) are sent as so-called "main stream attributes" every VBlank, that is, during each interval between frames.
LCDs were built around this technology ecosystem, and thus naturally adopted many related approaches: fixed refresh rates, pixel-by-pixel and line-by-line refreshing of the screen (as opposed to a single-pass global refresh) and so on. Also, for simplicity, LCDs historically had a fixed backlight to control brightness.
Fixed refresh rates offered other benefits for LCDs that have only more recently started to be exploited. Because the timing between each frame is known in advance, so-called overdrive techniques can be implemented easily, thus reducing the effective response time of the display (minimizing ghosting). Furthermore, LCD backlights could be strobed rather than set to always-on, resulting in reduced pixel persistence at a set level of brightness. Both technologies are known by various vendor-specific terms, but "pixel transition overdrive" and "LCD backlight strobing" can be considered the generic versions.
Why Are Fixed Display Refresh Rates An Issue?
GPUs inherently render frames at variable rates. Historically, LCDs have rendered frames at a fixed rate. So, until recently, only two options were available to frustrated PC gamers:
- Sync the GPU rate to the LCD rate and duplicate frames when necessary ("v-sync on"), which results in stuttering and lag.
- Leave the GPU unsynchronized and send updated frames mid-refresh ("v-sync off"), which results in screen tearing.
Without G-Sync or FreeSync, there simply was no way around this trade-off; gamers were forced to choose between the two.
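The stutter half of that trade-off can be modeled in a few lines. In this simplified sketch (our own illustration, not any vendor's implementation), a frame that finishes just after a refresh boundary is held for a whole extra interval:

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval at a fixed 60 Hz (~16.7 ms)

def presented_intervals(render_times_ms):
    """With v-sync on, a finished frame waits for the next refresh
    boundary, so every frame is displayed for a whole number of
    refresh intervals."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]

# An 18 ms render barely misses one boundary and occupies two slots:
print(presented_intervals([15.0, 18.0, 15.0]))
```

The 18 ms frame ends up on screen for about 33.3 ms while its neighbors get 16.7 ms each; that uneven cadence is the stutter gamers perceive.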
Enter FreeSync And G-Sync
Although a variable refresh-rate standard had long existed for the mobile market (mostly for power-saving benefits), Nvidia was the first to realize the potential of introducing variable refresh rates to desktop gaming-oriented LCDs. The company's solution launched as a proprietary "closed" system dubbed G-Sync. AMD followed suit by announcing "FreeSync," which hinged on an optional standard by VESA under the name "Adaptive-Sync".
The main difference between FreeSync and Adaptive-Sync is that Adaptive-Sync is, strictly speaking, just a DisplayPort protocol addition, whereas FreeSync involves the whole render chain (GPU, DisplayPort protocol and display). It would be correct to say that Adaptive-Sync is a component of FreeSync, or that FreeSync builds and expands upon Adaptive-Sync.
Both AMD's FreeSync and Nvidia's G-Sync operate by manipulating some features of the DisplayPort stream to enable a variable Vblank interval that, in turn, results in a variable refresh rate.
One key difference between FreeSync and G-Sync is that Nvidia actually went through the process of designing a custom LCD scaler (based on an expensive component called a field-programmable gate array or FPGA), whereas AMD entered into agreements with leading scaler manufacturers to support FreeSync in their future products. This difference is very important. The broad implication is that Nvidia will have tighter control of G-Sync's operation, but needs to price its solution higher than AMD's. Because of the components involved, G-Sync displays cost, and will likely continue to cost, $150 to $200 more than their AMD counterparts.
Because of the way FreeSync was established, AMD has little control over how display manufacturers implement the technology. This led to initial quality control issues, such as the flicker many users reported when using the DisplayPort cable and firmware provided with Acer's XG270HU. Apparently, these issues were fixed by the latest display firmware.
To help mitigate problems like that, AMD said it established a QC process to determine how displays become eligible for the FreeSync brand/logo. Unfortunately, AMD declined to disclose what quality standards will be used to establish a "pass" for display manufacturers. You'll have to take AMD's word that future FreeSync-branded monitors will not demonstrate flickering or artifacts, but you'll still need to take up any issues with the display manufacturer.
To be fair, we should note that G-Sync is not entirely flicker-free either. The most annoying limitation right now with both FreeSync and G-Sync is the flickering in menu and loading screens. Try playing Pillars of Eternity with either technology enabled. It's not fun. Hopefully, future driver updates mitigate the artifacts for both vendors.
We covered Nvidia's G-Sync technology extensively in an earlier article: G-Sync Technology Preview: Quite Literally A Game Changer. Today, we'll go over FreeSync in detail and highlight how it differs from G-Sync.
If you'd like to see a proper blind test of the two technologies, I'd also encourage you to read our testing event, AMD FreeSync Versus Nvidia G-Sync: Readers Choose.
Thanks to Bill Lempesis, Executive Director of VESA, we were able to both review the full DisplayPort 1.3 specification and preview a draft of the upcoming Adaptive‐Sync update, which is expected to be incorporated as an optional specification to the former standard. In May 2014, a similar addition was made to the earlier DisplayPort 1.2a standard—the first time that Adaptive-Sync was officially referenced as an industry standard.
Overall, the biggest catch with Adaptive-Sync is that it's optional. No VESA member is required to implement or support it. The certification routine is separate from that of DisplayPort itself. As of this writing, Adaptive-Sync does not even have its own logo. Having a display or GPU that exposes a DisplayPort connection is, in and of itself, no guarantee of Adaptive-Sync support. Sadly, as with all optional standards, consumers are bound to be confused.
Adaptive-Sync works by leveraging an optional DisplayPort feature. By telling the "sink" (the display) to "ignore main stream attributes," it can effectively use variable Vblank periods, creating a variable refresh rate.
Adaptive-Sync, and the FreeSync standard built on top of it, require the specification of a certain range within which a display can operate on a variable-refresh basis (for example, 30 to 144Hz). The range is set by the capabilities of the LCD panel and scaler, not by the GPU; the GPU must honor it. Note that this range may not be, and typically is not, the full range of the display itself. For example, a display that can operate at fixed refresh rates of both 24Hz and 144Hz may opt to expose a variable range of only 35 to 90Hz.
The Adaptive-Sync standard does not cover how the system should behave when frame rates drop outside the supported range (below 30 frames per second or above 144 in a 30 to 144Hz range), except for implicitly stating that the refresh rate should not go above or below that range. It is left up to the GPU/driver combination to determine that.
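A driver could honor the panel's range with something as simple as the clamping sketch below. This is our own hypothetical illustration of one possible out-of-range policy; as noted above, the standard itself leaves the actual behavior to the GPU/driver combination:

```python
def effective_refresh_hz(instantaneous_fps, panel_min_hz, panel_max_hz):
    """Keep the signaled refresh rate inside the panel's advertised
    variable range; outside it, simply pin to the nearest bound."""
    return max(panel_min_hz, min(instantaneous_fps, panel_max_hz))

# A 25 FPS dip on a 40-144Hz panel gets pinned to the 40Hz floor,
# and a 200 FPS burst gets pinned to the 144Hz ceiling:
print(effective_refresh_hz(25, 40, 144))   # 40
print(effective_refresh_hz(200, 40, 144))  # 144
```

Real drivers use more sophisticated policies (frame doubling at the low end, for instance), but the clamp captures the contract: the signaled rate never leaves the panel's advertised window.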
All in all, Adaptive-Sync is an exceptionally "light" standard; the whole addendum is a mere two pages long. And especially because it's optional, it's unlikely to deliver the rapid standardization of variable refresh rates that gamers are hoping for.
Where It Gets Tricky
Two of the most exciting features found in modern LCDs are pixel transition overdrive and, in some of the newer panels, backlight strobing.
Both technologies dramatically improve the response time of LCD displays and lower image persistence, significantly reducing the "ghosting" effect that has long plagued LCDs. If you're lucky enough to be younger than I am and have never played a first-person shooter on a CRT display, you ought to try one of these new strobing displays. They get really close to that experience, minus the headache. Plus, they weigh some 50 lbs. less. Then again, in this editor's tongue-in-cheek opinion, you're not a hardcore gamer until you've carried your own CRT display to a LAN party.
The image below is the best we've seen to characterize motion blur at different settings. Credit goes to the folks over at testufo.com.
Variable refresh rates pose an exceptional challenge to the two aforementioned technologies, which have historically operated on the assumption that the timing of frames was known in advance. In a fixed refresh rate scenario, say 60Hz, every frame lasts 1/60 of a second, or about 16.7 milliseconds. Voltages could therefore be overdriven and backlights strobed while maintaining color consistency and a consistent global luminosity level.
Now, with FreeSync and G-Sync, the display does not know when the next frame will arrive or how long the current frame will be displayed. Consequently, it cannot easily overdrive pixel voltages without weird color outcomes. It cannot strobe on-demand, as display luminosity would vary in a maddening way as refresh rate varies dynamically. In order for these technologies to work, the display scaler would need to guess when the next frame will come and operate accordingly.
Display manufacturers only recently started to implement so-called variable refresh rate overdrive to help address this issue. We wrote about Nvidia's implementation in Nvidia's G-Sync Updates: Windowed Mode, Notebook Implementation, New Displays. Asus and its OEM partners added something similar to the upcoming MG279Q FreeSync display. Acer's XG270HU reportedly supports overdrive with FreeSync as well, but requires a firmware update and the version we tested didn't have it, unfortunately.
The operating principle of variable refresh rate overdrive is pretty simple. The display scaler guesses the next frame time based on previous frame times and varies the voltage overdrive setpoint accordingly (higher for shorter frame times, lower for longer ones). The worst that can happen, after all, is more ghosting than is ideal, and somewhat less accurate colors in scenes in motion. Either way, it works better than disabling overdrive altogether.
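As a rough illustration of that guessing game, the hypothetical scaler model below predicts the next frame time as a moving average of recent frames and scales its overdrive setpoint inversely with it. The class name, window size and gain constants are all our own assumptions for demonstration purposes, not any vendor's actual algorithm:

```python
from collections import deque

class OverdrivePredictor:
    """Toy model of variable refresh rate overdrive: predict the next
    frame time from a moving average of recent frame times, then push
    the overdrive setpoint higher for shorter predicted frames."""

    def __init__(self, window=4, base_voltage=1.0, gain=10.0):
        self.history = deque(maxlen=window)  # recent frame times, ms
        self.base = base_voltage
        self.gain = gain

    def next_setpoint(self, last_frame_ms):
        self.history.append(last_frame_ms)
        predicted_ms = sum(self.history) / len(self.history)
        # Shorter predicted frame times -> stronger overdrive, and vice versa
        return self.base + self.gain / predicted_ms

# Fast frames (5 ms) warrant a higher setpoint than slow ones (20 ms):
fast = OverdrivePredictor().next_setpoint(5.0)
slow = OverdrivePredictor().next_setpoint(20.0)
print(fast > slow)  # True
```

When the prediction misses (a sudden frame-time spike, say), the setpoint is simply wrong for one frame, which matches the failure mode described above: a bit of extra ghosting or slightly off colors rather than anything catastrophic.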
The variable strobing challenge is more difficult to overcome from an engineering standpoint, and, at the time of this writing, no single OEM has even mentioned the possibility of variable pulse-width strobed displays. If low pixel persistence (that is, minimizing motion blur) is your highest concern, you will likely need to sacrifice G-Sync/FreeSync for months, if not years, to come.
Another challenge is windowed mode. On the desktop, there generally is no need for variable refresh rates. Where variable refresh rates come in handy is when you're playing 3D games or using 3D-accelerated applications. The former are usually played in full-screen mode, and display drivers are smart enough to recognize when a full-screen application takes over from Windows' Desktop Window Manager, and can thus enable the variable refresh rate.
But what if you want to play/operate in windowed mode? Windows' Desktop Window Manager is still handling desktop composition, and so you're stuck with fixed refresh rates. As matters stand, G-Sync enables variable refresh rate operation in windowed mode. AMD is looking into the capability, but hasn't estimated a target for its implementation.
My personal experience with G-Sync in windowed mode is a mixed bag at best. While some applications work better than others, Kerbal Space Program, for example, makes the entire desktop flicker to the point where I really wouldn't want G-Sync on at all. The flickering in other applications appears less extreme.
Interview With AMD
The following are excerpts from our conversation with Robert Hallock, head of global technical marketing at AMD:
Tom's Hardware: How are you feeling about the market momentum FreeSync is seeing?
Hallock: Great. We went from three monitors (actively sold in the market) in March to 19 in July. We've fixed all of the early issues that surfaced in the market through firmware or driver updates. We've introduced a quality certification program, and have already failed certain displays, to further guarantee the quality of displays sporting the FreeSync logo.
(Ed.: We were not satisfied with pure product range figures, so we tried asking both Newegg and Asus about actual sales volumes. Unfortunately, both companies declined to comment on relative sales volumes of G-Sync-equipped versus FreeSync-equipped displays.)
TH: What standards does a display sporting the FreeSync logo need to meet?
Hallock: Certain standards pertaining to, for example, a minimum variable refresh rate range of operation in FreeSync mode. We cannot disclose the details, for competitive reasons.
TH: The Adaptive-Sync standard you promoted, and VESA adopted as an optional standard, feels like a rather "light" standard. Is there an aspiration to expand that standard further?
Hallock: Adaptive-Sync as a VESA standard deals with the DisplayPort protocol only. FreeSync is the (higher order) standard, licensed free of charge to display OEMs, that deals with all other aspects of the technology.
TH: Why is the Asus MG279Q limited to a 35 to 90Hz variable refresh rate in FreeSync mode?
Hallock: That's a question for Asus. FreeSync, as a technology, supports a 9 to 240Hz native variable refresh rate range. It's important to not confuse limitations of FreeSync as a technology with the design choices that display OEMs choose to make, or to associate specific issues or limitations with individual products with the technology more broadly. For instance, the upcoming Nixeus Vue 24″ will support FreeSync over a 30 to 144Hz range.
TH: Will you be promoting and/or supporting frame time prediction in future displays (to support pixel overdrive and variable-timed strobing)?
Hallock: These are display-OEM choices. We are aware that Asus recently introduced pixel overdrive in combination with FreeSync. We are not aware of the details of Asus' implementation.
(Ed.: Hallock told us that one thing AMD will never do is buffer frames or do anything that introduces greater latency.)
TH: Will FreeSync support windowed mode?
Hallock: We are looking into (windowed functionality). We don't have a timetable for that yet.
Hands-On FreeSync Testing
This article wouldn't be complete without a hands-on section, would it? With Acer's help, I had the chance to do some real-world testing of the company's XG270HU, which I paired with a Gigabyte R9 290X (GV-R929XOC-4GD 4GB) slotted into an aging but still extremely capable Core i7-950-based system running Windows 7 x64 and AMD's Catalyst display driver version 15.20.1062.
Acer's XG270HU is the first FreeSync display I've used extensively. I've had an Asus ROG Swift PG278Q (G-Sync) on my desk for several months now, so my expectations were based on that screen as a benchmark.
My first impression was less about FreeSync and more about the display itself. Although it supposedly sports the same base characteristics (144Hz TN panel), the Acer looked to be better-built, with much better color and viewing angles than Asus' offering. Conversely, Asus' on-screen display is, at least to me, much easier to use. The LED that changes color to coincide with status on the Asus (white/normal, red/G-Sync, green/3D) was a nice touch that Acer lacks. All in all, though, ignoring FreeSync and G-Sync for a moment, the Acer felt like the display I'd keep if I were forced to choose one. And that's significant, considering that Asus' monitor currently retails for $670 compared to the Acer's $500 price tag.
Setting up FreeSync was even easier than I expected. When I installed the latest Catalyst drivers, a pop-up window informed me that both my GPU and display supported FreeSync, then guided me through the control panel's FreeSync setup. A minute or two later, I was ready to go.
I spent hours getting almost seasick spinning a boat around in the waters of The Witcher 3 and going through the older sights of Columbia in BioShock: Infinite, all the while getting a general feel for how smooth the whole experience was, and for telltale signs of stuttering or tearing.
The simple conclusion: FreeSync worked great. Within its stated range of 40 to 144Hz, and even pushing the GPU close to the lower end of that range, the experience was as consistent as I had gotten used to with G-Sync. It's just amazing to see that AMD managed to pull this off at almost $200 below the prices of comparable solutions from Nvidia.
However, there are a few shortcomings worth mentioning.
At the higher end of the range, when the GPU hits 144+ frames per second, G-Sync actually limits the whole pipeline to 144 FPS (similar to, but not quite the same as, enabling v-sync). By contrast, FreeSync does not; frame rates climb above 144 FPS (we saw 160+). While AMD touts this as a "feature" of FreeSync, the byproduct is highly undesirable: screen tearing comes back when frame rates exceed the display's maximum refresh, even with FreeSync enabled. Personally, I prefer the frame-limited G-Sync implementation in practice. For most modern games and a majority of scenarios, it won't matter. Not many systems sustain 145+ FPS at 2160p. But as we saw from Borderlands in our testing event in LA, exceptions do exist.
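For readers curious what "limiting the whole pipeline" amounts to, here is a bare-bones frame limiter sketch. It's our own illustration of the general idea (sleep out the remainder of the minimum frame interval before presenting), not how Nvidia's driver actually implements its cap:

```python
import time

MIN_INTERVAL_S = 1.0 / 144  # minimum frame interval for a 144Hz ceiling

def wait_for_present(frame_start_s):
    """If the GPU finished early, sleep out the rest of the minimum
    interval so the pipeline never exceeds the panel's maximum rate."""
    elapsed = time.perf_counter() - frame_start_s
    if elapsed < MIN_INTERVAL_S:
        time.sleep(MIN_INTERVAL_S - elapsed)

# Usage: call at the end of each render-loop iteration.
start = time.perf_counter()
# ... render the frame here ...
wait_for_present(start)
```

Capping the rate this way keeps frame delivery inside the variable refresh range, which is precisely why tearing never reappears at the top end on G-Sync displays.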
[Update . . . This is a clarifying statement from AMD: "Users have the option to turn v-sync on, which will get rid of tearing once going above the DRR range of the monitor. To clarify, while in the DRR range of the monitor, FreeSync will always have the right of way before v-sync, which means v-sync will only turn active once outside the range." Filippo no longer has a FreeSync monitor, so we're unable to verify this first hand.]
Also, although the colors were definitely more vivid on the Acer than on Asus' screen, I did feel that the Acer suffered from more ghosting. As it turns out, the display unit that we were provided did not have the latest firmware, and consequently was disabling pixel transition overdrive with FreeSync enabled, resulting in the ghosting I experienced. If you buy this display for its FreeSync capabilities, do make sure to get the latest firmware installed.
Followup With Acer
Here are a few follow-up questions I asked Acer, along with responses from the company:
TH: What is the chip model of the Realtek/Novatek/MStar/(other?) scaler in the display?
Acer: We don't provide this information, for competitive reasons.
(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had already been shipped back by the time we received this answer.)
TH: Can you confirm the FreeSync range of the display?
Acer: The FreeSync range is 40 to 144Hz.
TH: Can you confirm that the display does not support backlight strobing?
Acer: It does not support backlight strobing.
TH: Is there any visual way to confirm that FreeSync is operating, other than just trusting that the option is checked in the Catalyst control center?
Acer: You can run the AMD FreeSync demo.
TH: Can you confirm that the overdrive setting is disabled in FreeSync mode (or, apparently, whenever a DisplayPort cable is used with a GPU that supports FreeSync), regardless of whether that feature is enabled? I understand there should be a firmware update to address this issue. Does the display we used already include the latest firmware with the overdrive/FreeSync fix?
Acer: Users can adjust the OD setting in FreeSync mode. They can select Extreme/Normal/Off manually. The OD implementation is based on the scaler. AMD and our Original Design Manufacturer aligned response time values in FreeSync mode and implemented them into the scaler design. AMD checked every FreeSync monitor and worked with our product group during project development. We also went through AMD certification.
(Ed.: This is only true with the latest firmware installed. With the stock firmware, overdrive is effectively disabled when a DisplayPort connection is used, regardless of whether FreeSync is enabled or not)
Our prediction on the evolution of variable-refresh-rate technologies
Given the dramatically different approaches Nvidia and AMD are taking, the whole industry is going through an almost textbook market case study. Will the custom-built, higher-priced Nvidia solution win in the long run, or will the open standard-based industry strategy AMD is using see wider adoption?
One thing is certain: The standards battle is further polarizing the whole display chain purchasing process. You will no longer buy a GPU on its own merits. From now on, you will need to carefully think about what GPU/display combination is right for you, as FreeSync and G-Sync displays are not fully interoperable with the opposing GPU vendor. They will work, but you lose the variable refresh rate feature.
If we were to make a prediction, we'd say G-Sync is likely to see greater adoption in the next year or so, both being first to market and more tightly controlled than FreeSync. As FreeSync improves and its adoption picks up, however, the benefits of a higher-priced G-Sync custom solution will erode. By late 2016 or so, FreeSync should offer similar consistency at a much lower price. Nvidia may then try to switch out FPGAs for application-specific integrated circuits (ASICs) to lower the price of G-Sync, but will then likely be forced to sit at the VESA table and the industry will eventually converge on a common (non-optional) standard.
One big push for standardization will come from display OEMs. Having both FreeSync and G-Sync variants of displays forces display OEMs to carry much higher inventories than a single variant for each product. For as long as sales volumes for these new technologies are small, that will not be an issue. As the technologies become increasingly mainstream, however, it will increasingly be a matter OEMs will grumble about.
Until true standardization happens, it will be Betamax versus VHS or Blu-ray vs. HD-DVD all over again. You'll need to make a choice that suits you, knowing that nothing is guaranteed to be future-proof in a standards war scenario.
Either way, we're really glad that AMD's FreeSync is gaining momentum. We've tried it hands-on, and it works just as well as G-Sync. Choice is great for consumers, and we need more of it in this space. When that choice includes approximately $200 in savings on a new technology that actually has real benefits, we're happy to say well-done AMD. Keep it up.