FreeSync: AMD's Approach To Variable Refresh Rates

AMD's FreeSync technology is gaining momentum, but is the buzz warranted? We did our own research, worked with Acer and spoke with AMD to find out.

Introduction

Human ingenuity works in incredible but mysterious ways. We somehow managed to put a man on the moon (1969) before realizing that adding wheels to luggage was a good idea (Sadow's patent, 1970). In similar (although maybe not as spectacular) fashion, it took more than a decade after the introduction of PC LCD displays for people to realize that there really was no reason for them to operate using a fixed refresh rate. This first page is dedicated to answering why fixed refresh rates on LCDs are even a thing. First, we need to explain how contemporary video signaling works. Feel free to skip ahead if you're not interested in a bit of PC history.

Back in the '80s, cathode ray tubes (CRTs) used in TVs needed a fixed refresh rate because they physically had to sweep an electron beam across the screen pixel by pixel, then line by line and, once it reached the bottom of the screen, re-position the beam back at the top. Varying the refresh rate on the fly was impractical, at best. All of the supporting technology standards that emerged in the '80s, '90s and early '00s revolved around that necessity.

The most notable standard involved in controlling signaling from graphics processing units (GPUs) to displays is VESA's Coordinated Video Timings ("CVT," along with its "Reduced Blanking" cousins, "CVT-R" and "CVT-R2"), which, in 2002-2003, displaced the analog-oriented Generalized Timing Formula ("GTF") that preceded it. CVT became the de facto signaling standard for both the older DVI and newer DisplayPort interfaces.

Like GTF before it, CVT operates on a fixed "pixel clock" basis. The signal includes horizontal and vertical blanking intervals, along with horizontal and vertical frequencies. The pixel clock itself (which, together with some other factors, determines the interface bandwidth) is negotiated once and cannot easily be varied on the fly. It can be changed, though doing so typically makes the GPU and display go out of sync. Think about what happens when you change your display's resolution in your OS, or if you've ever tried EVGA's "pixel clock overclocker."
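To make the arithmetic concrete, here is a minimal Python sketch of how a fixed pixel clock falls out of the negotiated timing totals. The helper function is ours for illustration; the real CVT algorithm also derives the blanking sizes themselves, which we skip here.

    # Illustrative only: the real CVT formula also computes the blanking
    # sizes; this just shows how the pixel clock falls out of the totals.
    def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz):
        h_total = h_active + h_blank    # pixels per line, blanking included
        v_total = v_active + v_blank    # lines per frame, blanking included
        return h_total * v_total * refresh_hz / 1e6

    # 1920x1080 at 60Hz with the common television timing (280/45 blanking):
    # 2200 x 1125 x 60 = 148.5 MHz, fixed for as long as the link is up.
    print(pixel_clock_mhz(1920, 1080, 280, 45, 60))  # -> 148.5

Change any of those inputs and the negotiated clock changes with them, which is exactly why the link has to be re-trained rather than varied frame by frame.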

Now, in the case of DisplayPort, the video stream attributes (together with other information used to regenerate the clock between the GPU and display) are sent as so-called "main stream attributes" every VBlank, that is, during each interval between frames.

LCDs were built around this technology ecosystem, and thus naturally adopted many related approaches: fixed refresh rates, pixel-by-pixel and line-by-line refreshing of the screen (as opposed to a single-pass global refresh) and so on. Also, for simplicity, LCDs historically used a constantly lit backlight, with brightness set by its intensity.

Fixed refresh rates offered other benefits for LCDs that have only more recently started to be exploited. Because the timing between each frame is known in advance, so-called overdrive techniques can be implemented easily, thus reducing the effective response time of the display (minimizing ghosting). Furthermore, LCD backlights could be strobed rather than set to always-on, resulting in reduced pixel persistence at a set level of brightness. Both technologies are known by various vendor-specific terms, but "pixel transition overdrive" and "LCD backlight strobing" can be considered the generic versions.

Why Are Fixed Display Refresh Rates An Issue?

GPUs inherently render frames at variable rates. Historically, LCDs have rendered frames at a fixed rate. So, until recently, only two options were available to frustrated PC gamers:

  • Sync the GPU's output to the LCD's rate and duplicate frames when necessary (the familiar "v-sync on" setting), which results in stuttering and lag.
  • Leave the GPU unsynced and send updated frames mid-refresh ("v-sync off"), which results in screen tearing.

Without G-Sync or FreeSync, there was simply no way around this trade-off, and gamers were forced to choose between the two. The toy simulation below illustrates the dilemma.
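To see why, consider this toy Python simulation. The frame times are made up (real render times are far noisier), but the mechanics are the same:

    import math, random

    random.seed(1)
    REFRESH_MS = 1000 / 60                # one refresh every ~16.7 ms at 60Hz
    frame_ms = [random.uniform(12, 25) for _ in range(8)]  # fake GPU times

    # V-sync on: a finished frame must wait for the next refresh boundary,
    # so a 17 ms frame occupies two full refreshes (33.3 ms): visible stutter.
    for ms in frame_ms:
        on_screen = math.ceil(ms / REFRESH_MS) * REFRESH_MS
        print(f"rendered in {ms:4.1f} ms -> displayed for {on_screen:4.1f} ms")

    # V-sync off: the buffer flips the instant rendering finishes, even
    # mid-scanout, so one refresh can show slices of two frames: tearing.

Either the display dictates the pacing (and frames stutter), or the GPU does (and frames tear). There is no third option with a fixed refresh rate.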

Enter FreeSync And G-Sync

Although a variable refresh rate standard had long existed for the mobile market (mostly for its power-saving benefits), Nvidia was the first to realize the potential of bringing variable refresh rates to desktop gaming-oriented LCDs. The company's solution launched as a proprietary "closed" system dubbed G-Sync. AMD followed suit by announcing "FreeSync," which hinges on an optional VESA standard named "Adaptive-Sync."

The main difference between FreeSync and Adaptive-Sync is that Adaptive-Sync is, strictly speaking, just a DisplayPort protocol addition, whereas FreeSync involves the whole render chain (GPU, DisplayPort protocol and display). It would be correct to say that Adaptive-Sync is a component of FreeSync, or that FreeSync builds and expands upon Adaptive-Sync.

Both AMD's FreeSync and Nvidia's G-Sync operate by manipulating features of the DisplayPort stream to enable a variable VBlank interval that, in turn, results in a variable refresh rate.

One key difference between FreeSync and G-Sync is that Nvidia actually went through the process of designing a custom LCD scaler (based on an expensive component called a field-programmable gate array or FPGA), whereas AMD entered into agreements with leading scaler manufacturers to support FreeSync in their future products. This difference is very important. The broad implication is that Nvidia will have tighter control of G-Sync's operation, but needs to price its solution higher than AMD's. Because of the components involved, G-Sync displays cost, and will likely continue to cost, $150 to $200 more than their AMD counterparts.

Because of the way FreeSync was established, AMD has little control over how display manufacturers implement the technology. This led to initial quality control issues, such as the flicker many users reported when using the DisplayPort cable and firmware provided with Acer's XG270HU. Apparently, these issues were fixed by the latest display firmware.

To help mitigate problems like that, AMD said it established a QC process to determine how displays become eligible for the FreeSync brand/logo. Unfortunately, AMD declined to disclose what quality standards will be used to establish a "pass" for display manufacturers. You'll have to take AMD's word that future FreeSync-branded monitors will not demonstrate flickering or artifacts, but you'll still need to take up any issues with the display manufacturer.

To be fair, we should note that G-Sync is not entirely flicker-free either. The most annoying limitation right now with both FreeSync and G-Sync is the flickering in menu and loading screens. Try playing Pillars of Eternity with either technology enabled. It's not fun. Hopefully, future driver updates mitigate the artifacts for both vendors.

We covered Nvidia's G-Sync technology extensively in an earlier article: G-Sync Technology Preview: Quite Literally A Game Changer. Today, we'll go over FreeSync in detail and highlight how it differs from G-Sync.

If you'd like to see a proper blind test of the two technologies, I'd also encourage you to read the results of our testing event, AMD FreeSync Versus Nvidia G-Sync: Readers Choose.

Adaptive-Sync

Foundations First

Thanks to Bill Lempesis, Executive Director of VESA, we were able both to review the full DisplayPort 1.3 specification and to preview a draft of the upcoming Adaptive-Sync update, which is expected to be incorporated as an optional addendum to that standard. In May 2014, a similar addition was made to the earlier DisplayPort 1.2a standard, marking the first time Adaptive-Sync was officially referenced as an industry standard.

Overall, the biggest catch with Adaptive-Sync is that it's optional. No VESA member is required to implement or support it. The certification routine is separate from that of DisplayPort itself. As of now, Adaptive-Sync does not even have its own logo! Having a display or GPU that exposes a DisplayPort connection is, in and of itself, no guarantee of Adaptive-Sync support. Sadly, as with all optional standards, consumers are bound to be confused.

Adaptive-Sync works by leveraging an optional DisplayPort feature. By telling the "sink" (the display) to "ignore main stream attributes," the GPU can effectively use variable VBlank periods, creating a variable refresh rate.

Both Adaptive-Sync and the FreeSync standard built on top of it require the specification of a certain range in which a display can operate on a variable-refresh basis (for example, 30 to 144Hz). That range is set by the capabilities of the LCD panel and scaler, not by the GPU, and the GPU is required to honor it. Note that this range may not be, and typically is not, the full range of the display itself. For example, a display that can operate at fixed refresh rates of both 24Hz and 144Hz may opt to expose a variable range of only 35 to 90Hz.

The Adaptive-Sync standard does not cover how the system should behave when frame rates fall outside the supported range (below 30 frames per second or above 144 in a 30 to 144Hz range), beyond implicitly stating that the refresh rate must not leave that range. The rest is left up to the GPU/driver combination, as the sketch below illustrates.
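As a rough illustration of the decision left to the driver, consider this Python sketch. The function and its logic are hypothetical, not AMD's actual driver code:

    # Hypothetical driver-side policy; not AMD's actual implementation.
    MIN_HZ, MAX_HZ = 30, 144              # window advertised by the display

    def refresh_for_frame(frame_ms):
        """Pick a refresh rate for a frame that took frame_ms to render."""
        rate = 1000.0 / frame_ms          # instantaneous frame rate, in Hz
        if rate > MAX_HZ:
            return MAX_HZ                 # above the window: tear or wait;
                                          # the standard doesn't say which
        if rate < MIN_HZ:
            return MIN_HZ                 # below the window: repeat frames;
                                          # again, the driver's choice how
        return rate                       # inside the window: refresh now

    print(refresh_for_frame(10))          # 100 FPS -> refresh at 100Hz
    print(refresh_for_frame(50))          # 20 FPS  -> clamped to the 30Hz floor

Inside the window, the display simply refreshes whenever a frame arrives; everything outside it is vendor policy, which is exactly where FreeSync and G-Sync diverge.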

All in all, Adaptive-Sync is an exceptionally "light" standard; the whole addendum is a mere two pages long. Especially because it's optional, it's unlikely to deliver the rapid standardization of variable refresh rates that gamers are hoping for.

Where It Gets Tricky

Two of the most exciting features found in modern LCDs are pixel transition overdrive and, in some of the newer panels, backlight strobing.

Both technologies dramatically improve the response time of LCD displays and lower image persistence, significantly reducing the "ghosting" effect that has long plagued LCDs. If you're lucky enough to be younger than I am and have never played a first-person shooter on a CRT display, you ought to try one of these new strobing displays. They get really close to that experience, minus the headache. Plus, they weigh some 50 lbs. less. Then again, in this editor's tongue-in-cheek opinion, you're not a hardcore gamer until you've carried your own CRT display to a LAN party.

The image below is the best we've seen to characterize motion blur at different settings. Credit goes to the folks over at testufo.com.

Variable refresh rates pose an exceptional challenge to the two aforementioned technologies. The issue is that they've historically operated on the basis that the timing of frames was known in advance. In a fixed refresh rate scenario, say 60Hz, every frame lasts 1/60 of a second, or 16.7 milliseconds. Therefore, voltages could be overdriven and backlights could be strobed while maintaining color consistency and a consistent global luminosity level.

Now, with FreeSync and G-Sync, the display does not know when the next frame will arrive or how long the current frame will be displayed. Consequently, it cannot easily overdrive pixel voltages without weird color outcomes. It cannot strobe on-demand, as display luminosity would vary in a maddening way as refresh rate varies dynamically. In order for these technologies to work, the display scaler would need to guess when the next frame will come and operate accordingly.

Display manufacturers only recently started to implement so-called variable refresh rate overdrive to help address this issue. We wrote about Nvidia's implementation in Nvidia's G-Sync Updates: Windowed Mode, Notebook Implementation, New Displays. Asus and its OEM partners added something similar to the upcoming MG279Q FreeSync display. Acer's XG270HU reportedly supports overdrive with FreeSync as well, but it requires a firmware update, and the unit we tested didn't have it, unfortunately.

The operating principle of variable refresh rate overdrive is pretty simple: the display scaler guesses the next frame time based on previous frame times and varies the voltage overdrive setpoint accordingly (higher for shorter frame times, lower for longer ones). The worst that can happen, after all, is more ghosting than is ideal, and somewhat less accurate colors in scenes in motion. Either way, it works better than disabling overdrive altogether.
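In Python-flavored terms, with every number invented purely for illustration, a scaler might do something like this:

    from collections import deque

    history = deque(maxlen=4)             # the last few frame times, in ms

    def overdrive_gain(frame_ms):
        """Guess the next frame time and scale overdrive strength to it."""
        history.append(frame_ms)
        predicted = sum(history) / len(history)   # naive moving average
        # Map ~7 ms frames (144Hz) to full overdrive and ~25 ms (40Hz) to
        # none; a wrong guess only costs some ghosting or a slight color
        # error, which still beats switching overdrive off entirely.
        gain = (25.0 - predicted) / (25.0 - 7.0)
        return max(0.0, min(1.0, gain))

    for ms in (7, 8, 16, 22, 9):
        print(f"{ms:2d} ms frame -> overdrive at {overdrive_gain(ms):.0%}")

The real firmware operates on panel-specific lookup tables rather than a linear ramp, but the predict-then-drive structure is the same.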

The variable strobing challenge is more difficult to overcome from an engineering standpoint, and, at the time of this writing, no single OEM has even mentioned the possibility of variable pulse-width strobed displays. If low pixel persistence (that is, minimizing motion blur) is your highest concern, you will likely need to sacrifice G-Sync/FreeSync for months, if not years, to come.

Another challenge is windowed mode. On the desktop, there generally is no need for variable refresh rates. Where variable refresh rates come in handy is when you're playing 3D games or using 3D-accelerated applications. The former are usually played in full-screen mode, and display drivers are smart enough to recognize when a full-screen application takes over from Windows' Desktop Window Manager, and can thus enable the variable refresh rate.

But what if you want to play/operate in windowed mode? Windows' Desktop Window Manager is still handling desktop composition, and so you're stuck with fixed refresh rates. As matters stand, G-Sync enables variable refresh rate operation in windowed mode. AMD is looking into the capability, but hasn't estimated a target for its implementation.

My personal experience with G-Sync in windowed mode is a mixed bag at best. While some applications work better than others, Kerbal Space Program, for example, makes the entire desktop flicker to the point where I really wouldn't want G-Sync on at all. The flickering in other applications appears less extreme.

Interview With AMD

The following are excerpts from our conversation with Robert Hallock, head of global technical marketing at AMD:

Tom's Hardware: How are you feeling about the market momentum FreeSync is seeing?

Hallock: Great. We went from three monitors (actively sold in the market) in March to 19 in July. We've fixed all of the early issues that surfaced in the market through firmware or driver updates. We've introduced a quality certification program, and have already failed certain displays, to further guarantee the quality of displays sporting the FreeSync logo.

(Ed.: We were not satisfied with pure product range figures, so we tried asking both Newegg and Asus about actual sales volumes. Unfortunately, both companies declined to comment on relative sales volumes of G-Sync-equipped versus FreeSync-equipped displays.)

TH: What standards does a display sporting the FreeSync logo need to meet?

Hallock: Certain standards pertaining to, for example, a minimum variable refresh rate range of operation in FreeSync mode. We cannot disclose the details, for competitive reasons.

TH: The Adaptive-Sync standard you promoted, and VESA adopted as an optional standard, feels like a rather "light" standard. Is there an aspiration to expand that standard further?

Hallock: Adaptive-Sync as a VESA standard deals with the DisplayPort protocol only. FreeSync is the (higher order) standard, licensed free of charge to display OEMs, that deals with all other aspects of the technology.

TH: Why is the Asus MG279Q limited to a 35 to 90Hz variable refresh rate in FreeSync mode?

Hallock: That's a question for Asus. FreeSync, as a technology, supports a 9 to 240Hz native variable refresh rate range. It's important to not confuse limitations of FreeSync as a technology with the design choices that display OEMs choose to make, or to associate specific issues or limitations with individual products with the technology more broadly. For instance, the upcoming Nixeus Vue 24″ will support FreeSync over a 30 to 144Hz range.

TH: Will you be promoting and/or supporting frame time prediction in future displays (to support pixel overdrive and variable-timed strobing)?

Hallock: These are display-OEM choices. We are aware that Asus recently introduced pixel overdrive in combination with FreeSync. We are not aware of the details of Asus' implementation.

(Ed.: Hallock told us that one thing AMD will never do is buffer frames or do anything that introduces greater latency.)

TH: Will FreeSync support windowed mode?

Hallock: We are looking into (windowed functionality). We don't have a timetable for that yet.

Hands-On FreeSync Testing

This article wouldn't be complete without a hands-on section, would it? With Acer's help, I had the chance to do some real-world testing of the company's XG270HU, which I paired with a Gigabyte R9 290X (GV-R929XOC-4GD 4GB) slotted into an aging but still extremely capable Core i7-950-based system running Windows 7 x64 and AMD's Catalyst display driver version 15.20.1062.

Acer's XG270HU is the first FreeSync display I've used extensively. I've had an Asus ROG Swift PG278Q (G-Sync) on my desk for several months now, so my expectations were based on that screen as a benchmark.

My first impression was less about FreeSync and more about the display itself. Although it supposedly sports the same base characteristics (144Hz TN panel), the Acer looked to be better-built, with much better color and viewing angles than Asus' offering. Conversely, Asus' on-screen display is, at least to me, much easier to use. The LED that changes color to coincide with status on the Asus (white/normal, red/G-Sync, green/3D) was a nice touch that Acer lacks. All in all, though, ignoring FreeSync and G-Sync for a moment, the Acer felt like the display I'd keep if I were forced to choose one. And that's significant, considering that Asus' monitor currently retails for $670 compared to the Acer's $500 price tag.

Setting up FreeSync was even easier than I expected. When I installed the latest Catalyst drivers, a pop-up window informed me that both my GPU and display supported FreeSync, then guided me through the control panel's FreeSync setup. A minute or two later, I was ready to go.

I spent hours getting almost seasick spinning a boat around in the waters of The Witcher 3 and going through the older sights of Columbia in BioShock: Infinite, all the while getting a general feel for how smooth the whole experience was, and for telltale signs of stuttering or tearing.

The simple conclusion: FreeSync worked great. Within its stated range of 40 to 144Hz, and even pushing the GPU close to the lower end of that range, the experience was as consistent as I had gotten used to with G-Sync. It's just amazing to see that AMD managed to pull this off at almost $200 below the prices of comparable solutions from Nvidia.

However, there are a few shortcomings worth mentioning.

At the higher end of the range, when the GPU hits 144+ frames per second, G-Sync actually limits the whole pipeline to 144 FPS (similar to, but not quite the same as, enabling v-sync). By contrast, FreeSync does not: frame rates actually climb above 144 (to about 160+). While AMD touts this as a "feature" of FreeSync, the byproduct is highly undesirable: screen tearing comes back once frame rates exceed the display's ceiling, even with FreeSync enabled. Personally, I prefer the frame-limited G-Sync implementation in practice. For most modern games and a majority of scenarios, it won't matter, since not many systems sustain 145+ FPS at 2160p. But as we saw from Borderlands at our testing event in LA, exceptions do exist. Capping the frame rate at the top of the FreeSync window sidesteps the problem, as sketched below.
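Here is a minimal sketch of what such a cap does, whether applied in-game or by a driver-level tool (illustrative Python, not any shipping implementation):

    import time

    CAP_HZ = 144
    FRAME_BUDGET = 1.0 / CAP_HZ           # ~6.9 ms per frame at the cap

    def render_frame():
        pass                              # stand-in for the real game work

    # If the GPU finishes early, sleep off the remainder of the budget so
    # the frame rate never climbs past the display's FreeSync ceiling.
    last = time.perf_counter()
    for _ in range(5):
        render_frame()
        elapsed = time.perf_counter() - last
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
        last = time.perf_counter()

The cost is a few milliseconds of added latency on very fast frames, which is usually a fair trade against visible tearing.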

[Update . . . This is a clarifying statement from AMD: "Users have the option to turn v-sync on, which will get rid of tearing once going above the DRR range of the monitor. To clarify, while in the DRR range of the monitor, FreeSync will always have the right of way before v-sync, which means v-sync will only turn active once outside the range." Filippo no longer has a FreeSync monitor, so we're unable to verify this first hand.]

Also, although the colors were definitely more vivid on the Acer than on Asus' screen, I did feel that the Acer suffered from more ghosting. As it turns out, the unit we were provided did not have the latest firmware and consequently disabled pixel transition overdrive whenever FreeSync was enabled, resulting in the ghosting I experienced. If you buy this display for its FreeSync capabilities, make sure to get the latest firmware installed.

Followup With Acer

Here are a few follow-up questions I asked Acer, along with responses from the company:

TH: What is the chip model of the Realtek/Novatek/MStar/(other?) scaler in the display?

Acer: We don't provide this information, for competitive reasons.

(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had already been shipped back by the time we received this answer.)

TH: Can you confirm the FreeSync range of the display?

Acer: The FreeSync range is 40 to 144Hz.

TH: Can you confirm that the display does not support backlight strobing?

Acer: It does not support backlight strobing.

TH: Is there any visual way to confirm that FreeSync is operating, other than just trusting that the option is checked in the Catalyst control center?

Acer: You can run the AMD FreeSync demo.

TH: Can you confirm that the overdrive setting is disabled in FreeSync mode (or, apparently, whenever a DisplayPort cable is used with a GPU that supports FreeSync), regardless of whether FreeSync is enabled? I understand there should be a firmware update to address this issue. Does the display we used already include the latest firmware with the overdrive/FreeSync fix?

Acer: Users can adjust the OD setting in FreeSync mode. They can select Extreme/Normal/Off manually. The OD implementation is based on the scaler. AMD and our Original Device Manufacturer aligned response time values in FreeSync mode and implemented into the scaler design. AMD checked every Freesync monitor and worked with our product group during project development. We also went through AMD certification.

(Ed.: This is only true with the latest firmware installed. With the stock firmware, overdrive is effectively disabled when a DisplayPort connection is used, regardless of whether FreeSync is enabled.)

Conclusion

Our prediction on the evolution of variable-refresh-rate technologies

Given the dramatically different approaches Nvidia and AMD are taking, the industry is living through an almost textbook market case study. Will Nvidia's custom-built, higher-priced solution win in the long run, or will AMD's open, standards-based strategy see wider adoption?

One thing is certain: the standards battle is further polarizing the display purchasing process. You will no longer buy a GPU on its own merits. From now on, you will need to think carefully about which GPU/display combination is right for you, as FreeSync and G-Sync displays are not fully interoperable with the opposing GPU vendor. A mismatched pairing will still work, but you lose the variable refresh rate feature.

If we were to make a prediction, we'd say G-Sync is likely to see greater adoption in the next year or so, thanks to both its first-mover advantage and Nvidia's tighter control over implementations. As FreeSync improves and its adoption picks up, however, the benefits of the higher-priced G-Sync custom solution will erode. By late 2016 or so, FreeSync should offer similar consistency at a much lower price. Nvidia may then try to swap out FPGAs for application-specific integrated circuits (ASICs) to lower the price of G-Sync, but it will likely be forced to sit at the VESA table eventually, and the industry will converge on a common (non-optional) standard.

One big push for standardization will come from display OEMs. Having both FreeSync and G-Sync variants of displays forces display OEMs to carry much higher inventories than a single variant for each product. For as long as sales volumes for these new technologies are small, that will not be an issue. As the technologies become increasingly mainstream, however, it will increasingly be a matter OEMs will grumble about.

Until true standardization happens, it will be Betamax versus VHS, or Blu-ray versus HD DVD, all over again. You'll need to make a choice that suits you, knowing that nothing is guaranteed to be future-proof in a standards war.

Either way, we're really glad that AMD's FreeSync is gaining momentum. We've tried it hands-on, and it works just as well as G-Sync. Choice is great for consumers, and we need more of it in this space. When that choice includes approximately $200 in savings on a new technology that has real benefits, we're happy to say: well done, AMD. Keep it up.

MORE: Best Graphics Cards For The Money
MORE: All Graphics Articles

Filippo L. Scognamiglio Pasini is a Contributing Writer for Tom's Hardware, covering Graphics. Follow him on Twitter.


Comments From The Forums
  • hatib
    more expensive but gr8 nvidia gsync i like it
  • DbD2
    Imo freesync has 2 advantages over gsync:
    1) price. No additional hardware required makes it relatively cheap. Gsync does cost substantially more.

    2) ease of implementation. It is very easy for a monitor maker to do the basics and slap a freesync sticker on a monitor. Gsync is obviously harder to add.

    However it also has 2 major disadvantages:
    1) Quality. There is no required level of quality for freesync other than that it can do some variable refresh. No min/max range, no anti-ghosting. No guarantees of behaviour outside of the variable refresh range. It's very much buyer beware - most freesync displays have problems. This is very different to gsync, which has a requirement for a high level of quality - you can buy any gsync display and know it will work well.

    2) Market share. There are a lot fewer freesync-enabled machines out there than gsync. Not only does nvidia have most of the market, but most nvidia graphics cards support gsync. Only a few of the newest radeon cards support freesync, and sales of those cards have been weak. In addition, the high end, where you are most likely to get people spending extra on fancy monitors, is dominated by nvidia, as is the whole gaming laptop market. Basically there are too few potential sales for freesync for it to really take off, unless nvidia or perhaps Intel decide to support it.
  • InvalidError
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
  • xenol
    I still think the concept of V-Sync must die because there's no real reason for it to exist any more. There are no displays that require precise timing that need V-Syncing to begin with. The only timing that should exist is the limit of the display itself to transition to another pixel.

    Quote:
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.

    Especially if it's supposedly an "open" standard.
  • nukemaster
    Quote:
    It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.

    It was kind of the highlight of the article.

    (Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had been shipped back at the time we received this answer)
    It is a shame that AMD is not pushing for some more standardization on these freesync enabled displays. A competition to ULMB would also be nice to see for games that already have steady frame rates.
  • jkhoward
    Of course you think the NVIDIA solution will win. You always do. This forum is becoming more and more biased.
  • InvalidError
    Anonymous said:
    I still think the concept of V-Sync must die because there's no real reason for it to exist any more.

    As stated in the article, modern LCDs still require some timing guarantees to drive pixels since the panel parameters to switch pixels from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, over-drive, under-drive, etc.

    While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.

    It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates but at the end of the day, the relationship between the two is simply f=1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers since it is almost directly comparable to frames per second.
  • Anonymous
    Freesync will clearly win, as a $200 price difference isn't trivial for most of us.

    Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.
  • dwatterworth
    A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.
  • TechyInAZ
    Very interesting read. I never knew that variable refresh rates had effects on light strobing.

    I wonder how adaptive sync and G sync will work when the new OLED monitors start hitting the market?
  • InvalidError
    Anonymous said:
    I wonder how adaptive sync and G sync will work when the new OLED monitors start hitting the market?

    Mostly unchanged, except for negligible (CRT-like) persistence and backlight strobing becoming history.
  • Chris Droste
    Quote:
    Freesync will clearly win, as a $200 price difference isn't trivial for most of us.

    Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.


    This. When the consumer is driving the value of a new standard, price is king. The quality is mostly similar, and at modern gaming resolutions (1200p, 1440p, 2160p, etc.) there's no way 90% of gamers will have enough computer to punch through that ceiling in most current games and create the observed screen tearing. So as long as the standard keeps improving via AMD's QC/revisions and the OEMs responding to observed QC or consumer-reported issues, most people will still opt to pay $25-75 versus $200, even if all that extra work starts adding to the price. That's just all there is to it. Unless nVidia gets in bed with the panel manufacturers (LG? Samsung? BenQ? Optronics?), this will be an uphill fight.
  • nukemaster
    The real issue with backlight strobing and VRR is that you cannot simply use the same on-time, because as the frame rate drops, the picture would get darker.

    Think of 1ms on, 7ms off @ 120Hz, then think of 1ms on and 15ms off at 60Hz. That is a much longer time in the dark.

    Since OLED has no backlight, they would have to adjust the pixel brightness (something that may or may not be hard to do while keeping the brightness some users want) to compensate for different frequencies.

    On the plus side, backlight strobing makes everything so nice to look at, as long as the motion is smooth, since skips are partially hidden by our own perception combined with sample-and-hold on non-strobed backlights.

    Also, backlight strobing needs tweaking to make the most of it. BenQ's default implementation is not as good as LightBoost, but has better color. This may be because LightBoost expects 3D glasses and is designed to compensate for them.
  • AnimeMania
    Since monitor manufacturers use more expensive LCD scalers to support higher resolutions, FreeSync allows monitor manufacturers to choose whichever LCD scaler will work for their monitor. G-Sync's one or two LCD scalers for all monitors might be a disadvantage, since Nvidia produces its own LCD scalers that have to support a wide range of resolutions from 1080p to 4K, along with those weird ultra-wide monitor resolutions.
    "Anything you can throw at it" solutions tend to be much more expensive than "just good enough to get the job done" solutions.
  • Anonymous
    Anonymous said:
    A suggestion for the high-end frame rate issue with FreeSync: turn on FRTC and set its maximum rate to the top end of the monitor's sync range.


    this is what i was thinking. i wonder why the article didn't mention this fix. +1 for this post.

    i agree with the conclusion that g-sync has the advantage now but freesync will be in an advantageous position in a year or so.
  • SamSerious
    Another aspect (besides the solution of just capping frame rates above 144Hz) is missing from the article.

    It is true that backlight strobing reduces ghosting heavily. I own an Asus VG278HE display. Although I use it with an AMD card (R9 380 4G) and not an nVidia one, I can manually activate strobing via the strobe utility (freeware, available on the Blur Busters blog). At the highest strobe setting, 10%, you get the impression of a static, CRT-like picture that is absolutely amazing. But you also get (at least I and the friends who tried it did) a headache quite fast. And if it's late and the display is the main light source in the room, everything will flicker, of course. Just moving your hand in front of the screen is very irritating.
    Some companies, like BenQ with their FlickerFree advertising, build monitors with a non-PWM backlight (dimming the LEDs via current control), which is a lot friendlier to the eye, especially for office use. However, it may affect color neutrality when dimming the screen to very low brightness, and it therefore won't be able to get as dim as a PWM-controlled backlight. In the enthusiast flashlight world, PWM-controlled dimming of LEDs is considered a cheap and awful solution.

    What I want to say is that strobing works great and is absolutely stunning, but if you tend to get headaches from things like that, better try it out before buying your GPU and screen just for that feature. For office tasks you will love a non-flickering backlight, that's for sure.

    Apart from that, this is just another great article on tomshw that's definitely worth reading.
  • Super_Nova
    Hmm, I wonder: could HMDs like the Rift benefit from G-Sync or FreeSync?
  • TNT27
    I played around with g-sync for a while; I think all this crap is over-rated and not worth a penny. Same thing with mechanical keyboards, hell, old membrane keyboards had faster response times. Neither seems to make any difference in game-play for me.
  • alextheblue
    Quote:
    I played around with g-sync for a while; I think all this crap is over-rated and not worth a penny. Same thing with mechanical keyboards, hell, old membrane keyboards had faster response times. Neither seems to make any difference in game-play for me.

    Go home TNT, you're drunk.
  • Verrin
    I have an Acer XG270HU with the most up-to-date firmware. I got an exchange with Acer when I learned that the OD settings did not function (I was getting ghosting galore). But with everything working as it should, I can confirm that I get absolutely no ghosting with OD settings on normal and extreme. Also, in every game I've tested, if you enable vsync the game will ignore the fact that it's enabled unless you go over the maximum refresh rate of the panel. So FreeSync works fine but will cap it at 144Hz.