Hands-On FreeSync Testing
This article wouldn't be complete unless we had a hands-on section, would it? With Acer's help, I had the chance to do some real-world testing of the company's XG270HU, which I paired with a Gigabyte R9 290X (GV-R929XOC-4GD 4GB) slotted into an aging but still extremely capable Core i7-950-based system running Windows 7 x64 and AMD's Catalyst display driver version 15.20.1062.
Acer's XG270HU is the first FreeSync display I've used extensively. I've had an Asus ROG Swift PG278Q (G-Sync) on my desk for several months now, so my expectations were based on that screen as a benchmark.
My first impression was less about FreeSync and more about the display itself. Although it supposedly sports the same base characteristics (144Hz TN panel), the Acer looked to be better-built, with much better color and viewing angles than Asus' offering. Conversely, Asus' on-screen display is, at least to me, much easier to use. The LED that changes color to coincide with status on the Asus (white/normal, red/G-Sync, green/3D) was a nice touch that Acer lacks. All in all, though, ignoring FreeSync and G-Sync for a moment, the Acer felt like the display I'd keep if I were forced to choose one. And that's significant, considering that Asus' monitor currently retails for $670 compared to the Acer's $500 price tag.
Setting up FreeSync was even easier than I expected. When I installed the latest Catalyst drivers, a pop-up window informed me that both my GPU and display supported FreeSync, then guided me through the control panel's FreeSync setup. A minute or two later, I was ready to go.
I spent hours getting almost seasick spinning a boat around in the waters of The Witcher 3 and going through the older sights of Columbia in BioShock: Infinite, all the while getting a general feel for how smooth the whole experience was, and for telltale signs of stuttering or tearing.
The simple conclusion: FreeSync worked great. Within its stated range of 40 to 144Hz, and even pushing the GPU close to the lower end of that range, the experience was as consistent as I had gotten used to with G-Sync. It's just amazing to see that AMD managed to pull this off at almost $200 below the prices of comparable solutions from Nvidia.
However, there are a few shortcomings worth mentioning.
At the top of the range, when the GPU pushes past 144 frames per second, G-Sync caps the whole pipeline at 144 FPS (similar to, though not quite the same as, enabling v-sync). FreeSync does not; frame rates climb above 144, to roughly 160 and beyond. While AMD touts this as a "feature" of FreeSync, the byproduct is highly undesirable: screen tearing returns once frame rates exceed 145 FPS, even with FreeSync enabled. Personally, I prefer G-Sync's frame-limited implementation in practice. For most modern games and the majority of scenarios, it won't matter; not many systems sustain 145+ FPS at 2560x1440. But as we saw from Borderlands at our testing event in LA, exceptions do exist.
[Update: This is a clarifying statement from AMD: "Users have the option to turn v-sync on, which will get rid of tearing once going above the DRR range of the monitor. To clarify, while in the DRR range of the monitor, FreeSync will always have the right of way before v-sync, which means v-sync will only turn active once outside the range." Filippo no longer has a FreeSync monitor, so we're unable to verify this firsthand.]
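The tearing-above-range behavior comes down to frame pacing: if the GPU is never allowed to finish a frame faster than the panel's maximum refresh interval, frame rates can't exceed the variable-refresh window. As a rough illustration only (not AMD's or Nvidia's actual driver logic; the function and constant names here are hypothetical), a software frame cap like FRTC can be sketched as a loop that sleeps away any leftover time each frame:

```python
import time

TARGET_FPS = 144                    # cap chosen to match the panel's max refresh
MIN_FRAME_TIME = 1.0 / TARGET_FPS   # minimum seconds between frame completions

def run_capped(render_frame, num_frames):
    """Render num_frames, sleeping as needed so that no frame completes
    sooner than MIN_FRAME_TIME after the previous one."""
    last = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        remaining = MIN_FRAME_TIME - (time.perf_counter() - last)
        if remaining > 0:
            time.sleep(remaining)   # burn off the surplus so FPS stays <= cap
        last = time.perf_counter()
```

Because every iteration takes at least MIN_FRAME_TIME, presented frames never outrun the monitor's 144Hz ceiling, which keeps the pipeline inside the variable-refresh range where tearing can't occur.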
Also, although the colors were definitely more vivid on the Acer than on Asus' screen, I did feel that the Acer suffered from more ghosting. As it turns out, the display unit that we were provided did not have the latest firmware, and consequently was disabling pixel transition overdrive with FreeSync enabled, resulting in the ghosting I experienced. If you buy this display for its FreeSync capabilities, do make sure to get the latest firmware installed.
More expensive, but great. I like Nvidia G-Sync.
IMO, FreeSync has two advantages over G-Sync:
1) Price. No additional hardware is required, which makes it relatively cheap; G-Sync costs substantially more.
2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a FreeSync sticker on a monitor. G-Sync is obviously harder to add.
However, it also has two major disadvantages:
1) Quality. There is no required level of quality for FreeSync other than that it can do some variable refresh: no min/max range, no anti-ghosting, no guarantees of behavior outside the variable refresh range. It's very much buyer beware; most FreeSync displays have problems. This is very different from G-Sync, which requires a high level of quality: you can buy any G-Sync display and know it will work well.
2) Market share. There are far fewer FreeSync-enabled machines out there than G-Sync ones. Not only does Nvidia have most of the market, but most Nvidia graphics cards support G-Sync, while only a few of the newest Radeon cards support FreeSync, and sales of those cards have been weak. In addition, the high end, where people are most likely to spend extra on fancy monitors, is dominated by Nvidia, as is the whole gaming laptop market. Basically, there are too few potential FreeSync sales for it to really take off, unless Nvidia or perhaps Intel decides to support it.
It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when those details are either part of a standard that anyone interested can obtain by paying the ticket price, or can easily be determined by simply popping the cover off the physical product.
I still think the concept of v-sync must die, because there's no real reason for it to exist anymore. No current display requires the kind of precise timing that made v-sync necessary in the first place. The only timing constraint that should exist is the display's own limit on how quickly a pixel can transition.
Especially if it's supposedly an "open" standard.
It was kind of the highlight of the article.
(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had been shipped back by the time we received this answer.)
It is a shame that AMD is not pushing for more standardization on these FreeSync-enabled displays. A competitor to ULMB would also be nice to see for games that already have steady frame rates.
Of course you think Nvidia's solution will win. You always do. This forum is becoming more and more biased.
16718305 said: "I still think the concept of V-Sync must die because there's no real reason for it to exist any more."
As stated in the article, modern LCDs still require some timing guarantees to drive their pixels, since the panel parameters for switching a pixel from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, over-drive, under-drive, etc.
While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.
It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates but at the end of the day, the relationship between the two is simply f=1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers since it is almost directly comparable to frames per second.
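The f = 1/t interchangeability above is easy to apply in practice. As a minimal sketch (the helper name is made up for illustration), converting a monitor's variable-refresh range into its equivalent frame-time window looks like this:

```python
def refresh_range_to_frame_times(min_hz, max_hz):
    """Convert a variable-refresh range in Hz to frame-time limits in ms.

    The fastest allowed frame time corresponds to the max refresh rate,
    and the slowest to the min refresh rate (t = 1/f, scaled to ms)."""
    return 1000.0 / max_hz, 1000.0 / min_hz

# The XG270HU's stated 40-144Hz range maps to roughly 6.9-25.0 ms per frame.
fastest, slowest = refresh_range_to_frame_times(40, 144)
```

So a game holding 60 FPS (16.7 ms frame times) sits comfortably inside that window, while anything slower than 25 ms per frame falls below the panel's variable-refresh floor.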
FreeSync will clearly win, as a $200 price difference isn't trivial for most of us.
Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.
A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.
Very interesting read. I never knew that variable refresh rates had effects on light strobing.
I wonder how Adaptive-Sync and G-Sync will work when the new OLED monitors start hitting the market?