AMD FreeSync Versus Nvidia G-Sync: Readers Choose

Welcome To The Big Show

Preparations actually kicked into high gear the night before everyone was due to show up. Both AMD and Nvidia arrived a day early to help dial everything in to their mutual satisfaction. Along the way, we discovered a couple of configuration points that needed to be discussed.

First was the decision to turn v-sync on or off outside of FreeSync’s effective range (again, on the Asus MG279Q, this is 35 to 90Hz). One company argued it should be left on, while the other countered that it should be off. The decision mattered, of course. With v-sync on, dropping below the variable refresh range takes a particularly heavy toll on frame rate, while turning it off causes quite a bit of tearing at frame rates above the range’s ceiling.
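To put those two failure modes in concrete terms, here’s a rough sketch of the trade-off as we understand it. This is our own simplified model of a 35 to 90Hz window, written in Python purely for illustration; it is not code from either vendor’s driver, and the exact behavior depends on the driver and monitor firmware.

```python
# Simplified illustration (not vendor code) of how a rendered frame is handled
# relative to an adaptive-sync window like the MG279Q's 35-90Hz range.

VRR_MIN_HZ = 35   # panel's variable refresh floor
VRR_MAX_HZ = 90   # panel's variable refresh ceiling

def present(frame_rate_fps: float, vsync_on: bool) -> str:
    """Describe what the display does for a given rendered frame rate."""
    if VRR_MIN_HZ <= frame_rate_fps <= VRR_MAX_HZ:
        # Inside the window, the refresh tracks the GPU either way.
        return f"refresh tracks the GPU at {frame_rate_fps:.0f}Hz (smooth, no tearing)"
    if frame_rate_fps > VRR_MAX_HZ:
        if vsync_on:
            return f"capped at {VRR_MAX_HZ}Hz (no tearing, some added lag)"
        return "frames flipped mid-scan (tearing above the ceiling)"
    # Below the floor
    if vsync_on:
        return "frames wait for the next refresh (stutter and a big frame-rate hit)"
    return "frames flipped immediately (tearing below the floor)"

for fps, vsync in [(140, True), (140, False), (60, False), (25, True), (25, False)]:
    print(f"{fps:>3} FPS, v-sync {'on' if vsync else 'off'}: {present(fps, vsync)}")
```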

In the end, we settled this by taking it to the community on Facebook, Twitter and our own forums. I counted more than 180 responses on Facebook before siding with the majority and making the call that we’d leave v-sync off, which incidentally coincides with AMD’s default driver behavior. I fired off my decision, set my alarm for 4:30AM and went to bed. At last count, the Facebook thread had grown to more than 560 responses.

The next morning, I pulled up to Newegg’s facility just after 7:00AM, followed shortly by the crew that’d run the event. As Tom’s Hardware, AMD and Nvidia began fine-tuning game settings on our systems, it became clear that we had different ideas of what constituted the “best” experience.

Borderlands was easy. Even with its most taxing options enabled, this game runs in excess of 140 FPS on both cards. Given a practical ceiling of 90Hz for FreeSync, we’d be outside of the technology’s range almost the entire time. Would gamers be able to tell? That’s what we’d be looking for in the data. Crysis and The Witcher were set to serve up good-looking graphics. We used a Very High system spec in the former, with 2x SMAA and High texture resolution keeping the average frame rate in the 50s. The latter was cranked up to its Ultra detail preset with High post-processing effects and HairWorks turned off. Frame rates on both cards were in the 40s.

Originally, I planned to set Battlefield to its Ultra preset at 2560x1440, exactly the way I play and benchmark the game. This would have kept both boards under 90 FPS through our test sequence. Nvidia wanted to dial quality back to High and turn down MSAA, though, nudging frame rates into the 100s. After watching company representatives demonstrate Battlefield at that level, I can say it’s noticeably smoother. However, we also didn’t want to deliberately hobble AMD, particularly with Borderlands already pushing above the Asus monitor's variable refresh range. So, we ended up making a compromise: one set of systems would run AMD and Nvidia at the Ultra preset I originally intended, while another set pitted AMD at Ultra against Nvidia at High. We’d note which workstations ran which settings and compare responses after the event to see whether the two pairings scored differently. The two companies agreed to this, and we wrapped up prep just minutes before the event was scheduled to begin.

After a brief welcome speech, the first group of eight sat down at their assigned PCs and fired up their games. They weren't told what to look for; they only knew they were there to experience G-Sync and FreeSync.

From that point on, the day would consist of wave after wave being escorted in, playing for several minutes, switching machines to play the same game on the competing technology, firing up a second game, playing and then switching back. At the end of each round, everyone got up and walked over to four notebooks we had set up with a SurveyGizmo questionnaire. Participants were asked to refrain from discussing their experience.

There was a short break in the middle where we all hit up the taco truck and shotgunned a Monster, but otherwise it was pretty much non-stop action until the end. When the last of the 48 participants submitted his survey, Zalman kicked off our raffle, which continued with thousands of dollars’ worth of swag from MSI, AMD and Nvidia. We concluded with a big reveal, disclosing the identities of the PCs everyone had used. The day was exhausting without question, but we all agreed that it was a success.

Our community members in attendance now know what they were playing on. However, they don’t know whether G-Sync or FreeSync won the audience’s favor. Let’s get into the results. [Editor's note: Just to clarify, based on some early comments, the community members only know what they were playing on at the very end of the day, after all participants had played and all surveys had been filled out.]

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • Wisecracker
    With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.

  • NethJC
    Both of these technologies are useless. How bout we start producing more high refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen.
  • AndrewJacksonZA
    Thank you for the event and thank you for your write up. Also, thank you for a great deal of transparency! :-)
  • Vlad Rose
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
  • jkrui01
    As always on Tom's, Nvidia and Intel win. Why bother making these stupid tests? Just review Nvidia and Intel hardware only, or better still, just post the pics and a "buy now" link on the page.
  • loki1944
    NethJC said: "Both of these technologies are useless. How bout we start producing more high refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen."

    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.
  • Traciatim
    It seems that, even with these hardware combinations and the mess of problems with the tests, NVidia's G-Sync provides an obviously better user experience. It's really unfortunate about the cost and other drawbacks (like working only in full screen).

    I would really like to see this tech applied to larger panels like TVs. It would be interesting to watch films shot at 24, 29.97, 30, 48 and 60 FPS from different sources all displayed at their native frame rates, with no judder and no frame-pacing tools (like 120Hz interpolation) that change the image.
  • Calculatron
    Huzzah!
  • cats_Paw
    "Our community members in attendance now know what they were playing on. "
    That's when you lost my interest.

    It is proven that if you give people this information, they will be affected by it, and that unfortunately defeats the whole purpose of using test subjects' subjective opinions to evaluate real-life performance rather than scientific facts (such as frames per second).

    Too bad, since the article seemed very interesting.
  • omgBlur
    NethJC said: "Both of these technologies are useless. How bout we start producing more high refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen."

    loki1944 said: "Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way."

    Speak for yourself, I invested in the ROG Swift and noticed the difference. This technology allows me to put out higher graphics settings on a single card @ 1440p and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces to as low as 40. No stuttering, no screen tearing.