Enter FreeSync And G-Sync
Although a variable refresh-rate standard had long existed for the mobile market (mostly for power-saving benefits), Nvidia was the first to realize the potential of introducing variable refresh rates to desktop gaming-oriented LCDs. The company's solution launched as a proprietary "closed" system dubbed G-Sync. AMD followed suit by announcing "FreeSync," which hinged on an optional standard by VESA under the name "Adaptive-Sync".
The main difference between FreeSync and Adaptive-Sync is that Adaptive-Sync is, strictly speaking, just a DisplayPort protocol addition, whereas FreeSync involves the whole render chain (GPU, DisplayPort protocol and display). It would be correct to say that Adaptive-Sync is a component of FreeSync, or that FreeSync builds and expands upon Adaptive-Sync.
Both AMD's FreeSync and Nvidia's G-Sync operate by manipulating some features of the DisplayPort stream to enable a variable Vblank interval that, in turn, results in a variable refresh rate.
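To make the Vblank mechanism concrete, here's a small illustrative sketch. With a fixed pixel clock, stretching the vertical blanking interval adds scanlines to each frame, which lengthens the frame period and therefore lowers the effective refresh rate. (The pixel clock and line counts below are assumed example values for a 1440p/144 Hz-class panel, not any vendor's real timings.)

```python
# Illustrative sketch only, not any vendor's actual implementation.
# A display's refresh interval is the time to scan all lines, active plus
# blanking. Variable-refresh schemes stretch the vertical blanking (Vblank)
# interval to delay the next refresh until a new frame is ready.

PIXEL_CLOCK_HZ = 585_600_000   # assumed pixel clock
H_TOTAL = 2720                 # assumed total pixels per line (active + blanking)
V_ACTIVE = 1440                # visible lines
V_BLANK_NOMINAL = 55           # assumed nominal blanking lines

def refresh_rate(extra_vblank_lines: int) -> float:
    """Effective refresh rate when Vblank is stretched by extra scanlines."""
    line_time = H_TOTAL / PIXEL_CLOCK_HZ                     # seconds per scanline
    total_lines = V_ACTIVE + V_BLANK_NOMINAL + extra_vblank_lines
    return 1.0 / (total_lines * line_time)                   # f = 1 / frame time

print(f"{refresh_rate(0):.1f} Hz")      # nominal rate, ~144 Hz
print(f"{refresh_rate(1495):.1f} Hz")   # Vblank doubled in lines -> ~72 Hz
```

Note the design point: nothing about the panel's active area changes; the GPU simply holds the display in blanking longer, so each refresh begins exactly when a new frame is delivered.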
One key difference between FreeSync and G-Sync is that Nvidia actually went through the process of designing a custom LCD scaler (based on an expensive component called a field-programmable gate array, or FPGA), whereas AMD entered into agreements with leading scaler manufacturers to support FreeSync in their future products. This difference is very important. The broad implication is that Nvidia will have tighter control over G-Sync's operation, but needs to price its solution higher than AMD's. Because of the components involved, G-Sync displays cost, and will likely continue to cost, $150 to $200 more than their FreeSync counterparts.
Because of the way FreeSync was established, AMD has little control over how display manufacturers implement the technology. This led to initial quality control issues, such as the flicker many users reported when using the DisplayPort cable and firmware provided with Acer's XG270HU. Apparently, these issues were fixed by the latest display firmware.
To help mitigate problems like that, AMD said it established a QC process to determine how displays become eligible for the FreeSync brand/logo. Unfortunately, AMD declined to disclose what quality standards will be used to establish a "pass" for display manufacturers. You'll have to take AMD's word that future FreeSync-branded monitors will not demonstrate flickering or artifacts, but you'll still need to take up any issues with the display manufacturer.
To be fair, we should note that G-Sync is not entirely flicker-free either. The most annoying limitation right now with both FreeSync and G-Sync is the flickering in menu and loading screens. Try playing Pillars of Eternity with either technology enabled. It's not fun. Hopefully, future driver updates mitigate the artifacts for both vendors.
We covered Nvidia's G-Sync technology extensively in an earlier article: G-Sync Technology Preview: Quite Literally A Game Changer. Today, we'll go over FreeSync in detail and highlight how it differs from G-Sync.
If you'd like to see a proper blind test of the two technologies, I'd also encourage you to read our testing event, AMD FreeSync Versus Nvidia G-Sync: Readers Choose.
FreeSync has two major advantages:
1) Price. No additional hardware is required, which keeps it relatively cheap; G-Sync costs substantially more.
2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a FreeSync sticker on a monitor; G-Sync is obviously harder to add.
However, it also has two major disadvantages:
1) Quality. There is no required level of quality for FreeSync other than the ability to do some amount of variable refresh: no min/max range, no anti-ghosting, no guarantees of behavior outside the variable refresh range. It's very much buyer beware; most FreeSync displays have problems. This is very different from G-Sync, which requires a high level of quality: you can buy any G-Sync display and know it will work well.
2) Market share. There are far fewer FreeSync-enabled machines out there than G-Sync-capable ones. Not only does Nvidia have most of the market, but most Nvidia graphics cards support G-Sync, whereas only a few of the newest Radeon cards support FreeSync, and sales of those cards have been weak. In addition, the high end, where people are most likely to spend extra on fancy monitors, is dominated by Nvidia, as is the whole gaming-laptop market. Basically, there are too few potential sales for FreeSync to really take off unless Nvidia, or perhaps Intel, decides to support it, especially since it's supposedly an "open" standard.
(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had already been shipped back by the time we received this answer.)

It is a shame that AMD is not pushing for more standardization on these FreeSync-enabled displays. A competitor to ULMB would also be nice to see for games that already run at steady frame rates.
While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.
It would have been more technically correct to call those limits min/max frame times rather than variable refresh rates, but at the end of the day the relationship between the two is simply f = 1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers, since refresh rate is almost directly comparable to frames per second.
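As a quick sanity check of that f = 1/t equivalence, here is a tiny conversion sketch (the 40-144 Hz figures are just an assumed example of a variable refresh range, not any particular monitor's spec):

```python
# Refresh rate (Hz) and frame time (ms) are two views of the same quantity:
# f = 1/t. These helpers convert between them, with times in milliseconds.

def hz_to_ms(hz: float) -> float:
    """Refresh rate in Hz -> frame time in milliseconds."""
    return 1000.0 / hz

def ms_to_hz(ms: float) -> float:
    """Frame time in milliseconds -> refresh rate in Hz."""
    return 1000.0 / ms

# A display advertised with a 40-144 Hz variable refresh range is equally
# a display with a ~6.94 ms to 25 ms min/max frame-time window.
print(hz_to_ms(144))   # ~6.94 ms
print(hz_to_ms(40))    # 25.0 ms
print(ms_to_hz(16.7))  # ~60 Hz, i.e. the classic "60 fps" frame budget
```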
Even if my card were an Nvidia, I'd get a FreeSync monitor. I'd rather keep the money and go without variable refresh rate technology.
I wonder how Adaptive-Sync and G-Sync will work when the new OLED monitors start hitting the market?