Interview With AMD
The following are excerpts from our conversation with Robert Hallock, head of global technical marketing at AMD:
Tom's Hardware: How are you feeling about the market momentum FreeSync is seeing?
Hallock: Great. We went from three monitors (actively sold in the market) in March to 19 in July. We've fixed all of the early issues that surfaced in the market through firmware or driver updates. We've introduced a quality certification program, and have already failed certain displays, to further guarantee the quality of displays sporting the FreeSync logo.
(Ed.: We were not satisfied with pure product range figures, so we tried asking both Newegg and Asus about actual sales volumes. Unfortunately, both companies declined to comment on relative sales volumes of G-Sync-equipped versus FreeSync-equipped displays.)
TH: What standards does a display sporting the FreeSync logo need to meet?
Hallock: Certain standards pertaining to, for example, a minimum variable refresh rate range of operation in FreeSync mode. We cannot disclose the details, for competitive reasons.
TH: The Adaptive-Sync standard you promoted, and VESA adopted as an optional standard, feels like a rather "light" standard. Is there an aspiration to expand that standard further?
Hallock: Adaptive-Sync as a VESA standard deals with the DisplayPort protocol only. FreeSync is the (higher order) standard, licensed free of charge to display OEMs, that deals with all other aspects of the technology.
TH: Why is the Asus MG279Q limited to a 35 to 90Hz variable refresh rate in FreeSync mode?
Hallock: That's a question for Asus. FreeSync, as a technology, supports a 9 to 240Hz native variable refresh rate range. It's important to not confuse limitations of FreeSync as a technology with the design choices that display OEMs choose to make, or to associate specific issues or limitations with individual products with the technology more broadly. For instance, the upcoming Nixeus Vue 24″ will support FreeSync over a 30 to 144Hz range.
TH: Will you be promoting and/or supporting frame time prediction in future displays (to support pixel overdrive and variable-timed strobing)?
Hallock: These are display-OEM choices. We are aware that Asus recently introduced pixel overdrive in combination with FreeSync. We are not aware of the details of Asus' implementation.
(Ed.: Hallock told us that one thing AMD will never do is buffer frames or do anything that introduces greater latency.)
TH: Will FreeSync support windowed mode?
Hallock: We are looking into (windowed functionality). We don't have a timetable for that yet.
Comments
More expensive, but Nvidia G-Sync is great. I like it.
IMO, FreeSync has two advantages over G-Sync:
1) Price. No additional hardware is required, which keeps it relatively cheap; G-Sync costs substantially more.
2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a FreeSync sticker on a monitor; G-Sync is obviously harder to add.
However, it also has two major disadvantages:
1) Quality. There is no required level of quality for FreeSync other than being able to do some variable refresh: no min/max range, no anti-ghosting, no guarantees of behavior outside the variable refresh range. It's very much buyer beware; most FreeSync displays have problems. This is very different from G-Sync, which requires a high level of quality: you can buy any G-Sync display and know it will work well.
2) Market share. There are far fewer FreeSync-capable machines out there than G-Sync-capable ones. Not only does Nvidia have most of the market, but most Nvidia graphics cards support G-Sync, while only a few of the newest Radeon cards support FreeSync, and sales of those cards have been weak. In addition, the high end, where people are most likely to spend extra on fancy monitors, is dominated by Nvidia, as is the whole gaming laptop market. Basically there are too few potential sales for FreeSync for it to really take off, unless Nvidia or perhaps Intel decide to support it.
It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when those details are either part of a standard that anyone interested can get a copy of for the ticket price, or can easily be determined by simply popping the cover on the physical product.
I still think the concept of V-Sync must die, because there's no real reason for it to exist any more. No modern display requires the precise timing that made V-Sync necessary in the first place. The only timing limit that should exist is the display's own speed in transitioning a pixel to another state.
(Quoting the earlier comment on withholding details "for competitive reasons":) Especially if it's supposedly an "open" standard.
(Quoting the earlier comment on withholding details "for competitive reasons":) It was kind of the highlight of the article.
(Quoting the article's editor's note about shipping the display back before opening it up:) It is a shame that AMD is not pushing for more standardization on these FreeSync-enabled displays. A competitor to ULMB would also be nice to see for games that already have steady frame rates.
Of course you think the Nvidia solution will win. You always do. This forum is becoming more and more biased.
(Quoting the earlier comment that "the concept of V-Sync must die":) As stated in the article, modern LCDs still require some timing guarantees to drive pixels, since the panel parameters for switching a pixel from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, overdrive, underdrive, etc.
While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.
It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates, but at the end of the day the relationship between the two is simply f = 1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers, since it is almost directly comparable to frames per second.
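The f = 1/t equivalence the comment describes can be sketched in a few lines of Python (the helper names are ours, and the factor of 1000 just converts seconds to milliseconds):

```python
def hz_to_frame_time_ms(hz):
    """Refresh rate (Hz) -> frame time in milliseconds: t = 1/f."""
    return 1000.0 / hz

def frame_time_ms_to_hz(ms):
    """Frame time (ms) -> refresh rate in Hz: f = 1/t."""
    return 1000.0 / ms

# The Asus MG279Q's 35-90Hz FreeSync window, restated as frame-time limits:
slowest = hz_to_frame_time_ms(35)  # ~28.6 ms between refreshes at the floor
fastest = hz_to_frame_time_ms(90)  # ~11.1 ms between refreshes at the ceiling
```

So a "35 to 90Hz variable refresh range" is exactly a requirement that consecutive refreshes arrive between roughly 11.1 and 28.6 milliseconds apart.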
FreeSync will clearly win, as a $200 price difference isn't trivial for most of us.
Even if my card were Nvidia, I'd get a FreeSync monitor. I'd rather have the money and not have the variable refresh rate technology.
A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.
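The idea behind that suggestion can be sketched as a simple calculation (a hypothetical helper, not how AMD's FRTC driver feature is actually implemented): delay each frame's presentation just enough that frames never arrive faster than the monitor's FreeSync ceiling.

```python
def required_delay_s(render_time_s, cap_hz):
    """Extra wait (seconds) before presenting a frame so the effective
    frame rate never exceeds cap_hz, e.g. the top of the monitor's
    FreeSync range. Frames slower than the cap need no delay."""
    return max(0.0, 1.0 / cap_hz - render_time_s)

# A frame rendered in 5 ms, capped at 90 Hz (~11.1 ms per frame),
# would wait roughly another 6.1 ms before being presented.
```

Clamping output to the top of the sync window this way keeps the display inside its variable refresh range instead of falling back to V-Sync-style behavior above it.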
Very interesting read. I never knew that variable refresh rates affected light strobing.
I wonder how Adaptive-Sync and G-Sync will work when the new OLED monitors start hitting the market?