FreeSync: AMD's Approach To Variable Refresh Rates
AMD's FreeSync technology is gaining momentum, but is the buzz warranted? We did our own research, worked with Acer and spoke with AMD to find out.
Conclusion
Our prediction on the evolution of variable-refresh-rate technologies
Given the dramatically different approaches Nvidia and AMD are taking, the industry is playing out an almost textbook case study in standards competition. Will the custom-built, higher-priced Nvidia solution win in the long run, or will AMD's open-standard-based strategy see wider adoption?
One thing is certain: The standards battle is polarizing the entire display-chain purchase. You will no longer buy a GPU on its own merits; from now on, you will need to think carefully about which GPU/display combination is right for you, because FreeSync and G-Sync displays are not fully interoperable with the opposing GPU vendor's cards. They will still work, but you lose the variable-refresh-rate feature.
If we were to make a prediction, we'd say G-Sync is likely to see greater adoption in the next year or so, thanks to being both first to market and more tightly controlled than FreeSync. As FreeSync improves and its adoption picks up, however, the benefits of the higher-priced custom G-Sync solution will erode. By late 2016 or so, FreeSync should offer similar consistency at a much lower price. Nvidia may then try to swap its FPGAs for application-specific integrated circuits (ASICs) to lower the price of G-Sync, but it will likely be forced to sit at the VESA table eventually, and the industry will converge on a common (non-optional) standard.
One big push for standardization will come from display OEMs. Offering both FreeSync and G-Sync variants of a display forces OEMs to carry much higher inventories than a single variant per product would. For as long as sales volumes for these new technologies remain small, that will not be an issue. As the technologies become mainstream, however, it will turn into a sore point that OEMs grumble about.
Until true standardization happens, it will be Betamax versus VHS, or Blu-ray versus HD DVD, all over again. You'll need to make the choice that suits you, knowing that nothing is guaranteed to be future-proof in a standards war.
Either way, we're really glad that AMD's FreeSync is gaining momentum. We've tried it hands-on, and it works just as well as G-Sync. Choice is great for consumers, and we need more of it in this space. When that choice includes roughly $200 in savings on a new technology with real benefits, we're happy to say: well done, AMD. Keep it up.
MORE: Best Graphics Cards For The Money
MORE: All Graphics Articles
Filippo L. Scognamiglio Pasini is a Contributing Writer for Tom's Hardware, covering Graphics. Follow him on Twitter.
DbD2 IMO freesync has 2 advantages over gsync:
1) Price. No additional hardware is required, which makes it relatively cheap. Gsync costs substantially more.
2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a freesync sticker on a monitor. Gsync is obviously harder to add.
However, it also has 2 major disadvantages:
1) Quality. There is no required level of quality for freesync other than that it can do some variable refresh. No min/max range, no anti-ghosting, no guarantees of behaviour outside the variable refresh range (see the frame-doubling sketch after this comment). It's very much buyer beware - most freesync displays have problems. This is very different from gsync, which has a requirement for a high level of quality - you can buy any gsync display and know it will work well.
2) Market share. There are far fewer freesync-enabled machines out there than gsync ones. Not only does nvidia have most of the market, but most nvidia graphics cards support gsync. Only a few of the newest radeon cards support freesync, and sales of those cards have been weak. In addition, the high end, where you are most likely to get people spending extra on fancy monitors, is dominated by nvidia, as is the whole gaming laptop market. Basically there are too few potential sales for freesync for it to really take off, unless nvidia or perhaps Intel decide to support it.
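To make the point about behaviour outside the variable refresh range concrete, here is a minimal Python sketch of the frame-doubling idea (along the lines of AMD's Low Framerate Compensation or G-Sync's frame repetition) that keeps a panel inside its window when a game runs slower than the panel's minimum refresh. The 40-144 Hz window and all names are hypothetical, chosen only for illustration; the real logic lives in the driver or scaler firmware.

```python
# Hypothetical sync window for illustration only; real panels vary.
VRR_MIN_HZ = 40.0   # bottom of the display's variable refresh range
VRR_MAX_HZ = 144.0  # top of the display's variable refresh range

def refresh_plan(fps: float) -> tuple[float, int]:
    """Return (panel_refresh_hz, repeats_per_frame) for a given game frame rate.

    Below the window, each rendered frame is shown multiple times so the
    panel still refreshes inside its supported range; above the window,
    the refresh rate is simply capped at the maximum.
    (Like real LFC, this assumes the window is wide enough that the
    multiplied rate stays under VRR_MAX_HZ.)
    """
    if fps > VRR_MAX_HZ:
        return VRR_MAX_HZ, 1   # cap: tear or wait, depending on vsync setting
    if fps >= VRR_MIN_HZ:
        return fps, 1          # inside the window: refresh tracks the game
    # Below the window: repeat each frame enough times to stay in range.
    repeats = int(VRR_MIN_HZ // fps) + 1
    return fps * repeats, repeats

for fps in (30.0, 60.0, 200.0):
    hz, n = refresh_plan(fps)
    print(f"{fps:5.1f} fps -> panel at {hz:5.1f} Hz, each frame shown {n}x")
```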
InvalidError It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
xenol I still think the concept of V-Sync must die because there's no real reason for it to exist any more. There are no displays left that require the kind of precise timing that made V-Sync necessary in the first place. The only timing that should exist is the limit of the display itself to transition to another pixel.
Quoting InvalidError: "It sounds hilarious to me how some companies and representatives refuse to disclose certain details 'for competitive reasons' when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product."
Especially if it's supposedly an "open" standard.
nukemaster Quoting InvalidError: "It sounds hilarious to me how some companies and representatives refuse to disclose certain details 'for competitive reasons' when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product."
It was kind of the highlight of the article.
Quoting the article: "(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had been shipped back at the time we received this answer)"
It is a shame that AMD is not pushing for some more standardization on these freesync-enabled displays. A competitor to ULMB would also be nice to see for games that already have steady frame rates.
jkhoward Of course you think the NVIDIA solution will win. You always do. This forum is becoming more and more biased.
InvalidError
Quoting xenol: "I still think the concept of V-Sync must die because there's no real reason for it to exist any more."
As stated in the article, modern LCDs still require some timing guarantees to drive pixels, since the panel parameters for switching pixels from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, over-drive, under-drive, etc.
While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.
It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates, but at the end of the day the relationship between the two is simply f = 1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers, since it is almost directly comparable to frames per second.
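The f = 1/t point is easy to check with a couple of helper functions; the 40-144 Hz window below is a hypothetical example rather than any specific monitor:

```python
def hz_to_ms(refresh_hz: float) -> float:
    """Convert a refresh rate in Hz to a frame time in milliseconds."""
    return 1000.0 / refresh_hz

def ms_to_hz(frame_time_ms: float) -> float:
    """Convert a frame time in milliseconds back to a refresh rate in Hz."""
    return 1000.0 / frame_time_ms

# A hypothetical "40-144 Hz" variable refresh window is the same constraint
# as "every frame must be held on screen between ~6.9 ms and 25 ms":
print(hz_to_ms(144))   # ~6.94 ms minimum frame time
print(hz_to_ms(40))    # 25.0 ms maximum frame time
print(ms_to_hz(16.7))  # ~59.9 Hz, i.e. a ~60 fps game
```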
Freesync will clearly win, as a $200 price difference isn't trivial for most of us.
Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology. -
dwatterworth A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.
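For readers curious what that cap amounts to, here is a minimal sketch of an FRTC-style frame limiter. The 144 Hz target and the render_frame() placeholder are hypothetical stand-ins; AMD's actual Frame Rate Target Control is implemented in the driver, not in game code.

```python
import time

TARGET_HZ = 144.0                  # hypothetical top of the monitor's sync range
MIN_FRAME_TIME = 1.0 / TARGET_HZ   # seconds per frame at the cap

def render_frame() -> None:
    """Placeholder for the game's real rendering work."""
    pass

def frame_loop(num_frames: int = 5) -> None:
    last = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        # If the frame finished early, wait out the remainder so frames are
        # never delivered faster than the display's supported maximum.
        elapsed = time.perf_counter() - last
        if elapsed < MIN_FRAME_TIME:
            time.sleep(MIN_FRAME_TIME - elapsed)
        last = time.perf_counter()

frame_loop()
```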
TechyInAZ Very interesting read. I never knew that variable refresh rates had effects on light strobing.
I wonder how Adaptive-Sync and G-Sync will work when the new OLED monitors start hitting the market?