FreeSync: AMD's Approach To Variable Refresh Rates

Enter FreeSync And G-Sync

Although a variable refresh-rate standard had long existed in the mobile market (mostly for its power-saving benefits), Nvidia was the first to realize the potential of bringing variable refresh rates to desktop gaming-oriented LCDs. The company's solution launched as a proprietary "closed" system dubbed G-Sync. AMD followed suit by announcing "FreeSync," which hinges on an optional VESA standard named "Adaptive-Sync."

The main difference between FreeSync and Adaptive-Sync is that Adaptive-Sync is, strictly speaking, just a DisplayPort protocol addition, whereas FreeSync involves the whole render chain: GPU, DisplayPort protocol and display. It would be correct to say that Adaptive-Sync is a component of FreeSync, or that FreeSync builds upon and expands Adaptive-Sync.

Both AMD's FreeSync and Nvidia's G-Sync operate by manipulating the vertical blanking (Vblank) interval of the DisplayPort stream: instead of refreshing on a fixed cadence, the display holds the current frame until the GPU delivers the next one, and that variable Vblank interval, in turn, results in a variable refresh rate.
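To make that relationship concrete, here is a minimal sketch of the timing logic. This is not AMD's or Nvidia's actual driver code; the panel limits and function name below are hypothetical:

```python
# Minimal sketch of variable-refresh timing: the display holds the current
# frame by stretching Vblank until the next frame arrives, clamped to the
# panel's supported refresh window. All numbers and names are hypothetical.

PANEL_MIN_HZ = 40.0   # slowest refresh this hypothetical panel supports
PANEL_MAX_HZ = 144.0  # fastest refresh this hypothetical panel supports

def effective_refresh_interval(frame_render_time_s: float) -> float:
    """Interval (in seconds) between two refreshes for a given frame
    render time, as a variable-refresh scheme might choose it."""
    min_interval = 1.0 / PANEL_MAX_HZ  # can't refresh faster than this
    max_interval = 1.0 / PANEL_MIN_HZ  # can't stretch Vblank past this
    # Inside the window, the refresh tracks the GPU exactly; outside it,
    # the rate is clamped and the driver must fall back to something else.
    return min(max(frame_render_time_s, min_interval), max_interval)

if __name__ == "__main__":
    for fps in (30, 60, 100, 200):
        interval = effective_refresh_interval(1.0 / fps)
        print(f"GPU at {fps:>3} fps -> refresh every "
              f"{interval * 1000:.2f} ms ({1.0 / interval:.1f} Hz)")
```

Within the panel's supported window, the refresh simply tracks the GPU's output; outside that window the rate is clamped, and how each vendor handles those out-of-range cases is one of the behavioral differences between the two technologies.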

One key difference between FreeSync and G-Sync is that Nvidia went through the process of designing a custom LCD scaler (based on an expensive component called a field-programmable gate array, or FPGA), whereas AMD entered into agreements with leading scaler manufacturers to support FreeSync in their future products. This difference matters: the broad implication is that Nvidia retains tighter control over G-Sync's operation, but has to price its solution higher than AMD's. Because of the components involved, G-Sync displays cost, and will likely continue to cost, $150 to $200 more than comparable FreeSync displays.

Because of the way FreeSync was established, AMD has little control over how display manufacturers implement the technology. This led to initial quality-control issues, such as the flicker many users reported when using the DisplayPort cable and firmware that shipped with Acer's XG270HU. These issues have reportedly been fixed by the latest display firmware.

To help mitigate problems like that, AMD said it established a QC process that determines how displays become eligible for the FreeSync brand and logo. Unfortunately, AMD declined to disclose what quality standards constitute a "pass" for display manufacturers. You'll have to take AMD's word that future FreeSync-branded monitors won't exhibit flickering or artifacts, and you'll still need to take up any issues with the display manufacturer.

To be fair, we should note that G-Sync is not entirely flicker-free either. The most annoying limitation right now with both FreeSync and G-Sync is flickering in menu and loading screens. Try playing Pillars of Eternity with either technology enabled; it's not fun. Hopefully, future driver updates mitigate these artifacts for both vendors.

We covered Nvidia's G-Sync technology extensively in an earlier article: G-Sync Technology Preview: Quite Literally A Game Changer. Today, we'll go over FreeSync in detail and highlight how it differs from G-Sync.

If you'd like to see a proper blind test of the two technologies, I'd also encourage you to read our testing event, AMD FreeSync Versus Nvidia G-Sync: Readers Choose.

  • hatib
    more expensive but gr8 nvidia gsync i like it
    Reply
  • DbD2
    Imo freesync has 2 advantages over gsync:
    1) Price. No additional hardware is required, which makes it relatively cheap. Gsync costs substantially more.

    2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a freesync sticker on a monitor. Gsync is obviously harder to add.

    However it also has 2 major disadvantages:
    1) Quality. There is no required level of quality for freesync other than that it can do some variable refresh. No min/max range, no anti-ghosting, no guarantees of behaviour outside the variable refresh range. It's very much buyer beware - most freesync displays have problems. This is very different from gsync, which requires a high level of quality - you can buy any gsync display and know it will work well.

    2) Market share. There are a lot fewer freesync-enabled machines out there than gsync. Not only does nvidia have most of the market, but most nvidia graphics cards support gsync. Only a few of the newest radeon cards support freesync, and sales of those cards have been weak. In addition, the high end, where you are most likely to get people spending extra on fancy monitors, is dominated by nvidia, as is the whole gaming laptop market. Basically, there are too few potential sales for freesync for it to really take off, unless nvidia or perhaps Intel decide to support it.
    Reply
  • InvalidError
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
    Reply
  • xenol
    I still think the concept of V-Sync must die because there's no real reason for it to exist any more. There are no displays that require precise timing that need V-Syncing to begin with. The only timing that should exist is the limit of the display itself to transition to another pixel.

    InvalidError said:
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
    Especially if it's supposedly an "open" standard.
    Reply
  • nukemaster
    InvalidError said:
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
    It was kind of the highlight of the article.

    (Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had already been shipped back by the time we received this answer.)

    It is a shame that AMD is not pushing for more standardization on these freesync-enabled displays. A competitor to ULMB would also be nice to see for games that already have steady frame rates.
    Reply
  • jkhoward
    Of course you think the NVIDIA solution will win. You always do. This forum is becoming more and more biased.
    Reply
  • InvalidError
    xenol said:
    I still think the concept of V-Sync must die because there's no real reason for it to exist any more.
    As stated in the article, modern LCDs still require some timing guarantees to drive pixels since the panel parameters to switch pixels from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, over-drive, under-drive, etc.

    While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.

    It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates, but at the end of the day the relationship between the two is simply f = 1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers, since it is almost directly comparable to frames per second.
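    As a quick sketch of that equivalence (my own numbers, nothing vendor-specific):

    ```python
    # f = 1/t: refresh rate (Hz) and the time between refreshes (t) are
    # reciprocals, so min/max frame times and a refresh-rate range are
    # the same constraint expressed two ways.
    def to_frame_time_ms(refresh_hz: float) -> float:
        return 1000.0 / refresh_hz

    def to_refresh_hz(frame_time_ms: float) -> float:
        return 1000.0 / frame_time_ms

    # A 40-144 Hz variable refresh window is a 6.94-25 ms frame-time window.
    print(to_frame_time_ms(144.0))  # ~6.94 ms
    print(to_frame_time_ms(40.0))   # 25.0 ms
    print(to_refresh_hz(16.7))      # ~59.9 Hz
    ```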
    Reply
  • Freesync will clearly win, as a $200 price difference isn't trivial for most of us.

    Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.
    Reply
  • dwatterworth
    A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.
    Reply
  • TechyInAZ
    Very interesting read. I never knew that variable refresh rates had an effect on light strobing.

    I wonder how Adaptive-Sync and G-Sync will work when the new OLED monitors start hitting the market.

    Reply