AMD FreeSync Versus Nvidia G-Sync: Readers Choose

The Setup

Two years ago, I put together a quick little event in Bakersfield, CA to evaluate the effectiveness of AMD’s frame pacing driver (Radeon HD 7990 Vs. GeForce GTX 690: The Crowd Picks A Winner) from an experiential standpoint. We gleaned some interesting information, talked technology and drank some beer at a local brewery. It was an all-around great time. As AMD’s FreeSync introduction came and went, we collectively decided that something similar, but on a larger scale, would be the best way to compare both variable refresh capabilities.

Editor-in-Chief Fritz Nelson covered the highlights of this most recent get-together in Tom's Hardware Readers Pit G-Sync Vs FreeSync In Weekend Battle, but a lot more went into the planning than our highlight reel shows. Even though we were billing this as an event — you know, so everyone could enjoy themselves — it would be handled like a proper laboratory experiment.

[Image: Sapphire R9 390X]
[Image: EVGA GTX 970]

The first step, taken months ago, was letting AMD and Nvidia know our intentions and securing their support. Both companies were quick to climb on board, putting the onus on us to create a comparison both sides would consider fair. The hardware had to be decided on first, of course. By the time we were ready to commit, AMD's 300 series had launched. So, Nvidia proposed using GeForce GTX 970 and pitting it against Radeon R9 390X. The GM204-based board could be overclocked to match Grenada XT’s performance, the company assured us. In retrospect, we should have countered that the R9 390 would be a more fitting match. But given the advantage it was being handed, AMD readily accepted that pairing.



[Image: Acer XB270HU]
[Image: Asus MG279Q]

Picking the monitors that both GPUs would drive proved more contentious. I wanted a comparison between FreeSync and G-Sync at 2560x1440, which meant eliminating the LCD panel type as a variable. Right away, our options narrowed to two pairings: Acer’s XB270HU (G-Sync) vs. Asus’ MG279Q (FreeSync), or Asus’ PG278Q (G-Sync) vs. BenQ’s XL2730Z (FreeSync). The former would give us two IPS-based screens, while the latter shifted to TN. We’ve long dreamed of the day when fast-refresh IPS monitors would be readily available at QHD resolutions, so that became our first choice. There was just one wrinkle: Asus’ MG279Q has a variable refresh range of 35 to 90Hz, whereas the BenQ can do 40 to 144Hz. So, while the Asus is perhaps a superior example of a gaming monitor, it isn’t necessarily the best representation of FreeSync if your graphics subsystem regularly pushes above 90 FPS. Nevertheless, both AMD and Nvidia signed off on our preference for IPS.
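To illustrate why that 35-90Hz window matters, here's a minimal sketch of how a given frame rate lands relative to each panel's variable refresh range. The G-Sync lower bound and the out-of-range fallback behaviors shown here are assumptions for illustration, not measured specs:

```python
# Hypothetical sketch: the quoted variable-refresh windows (min Hz, max Hz).
# The XB270HU's lower bound is an assumption; only the FreeSync ranges
# come from the article.
PANELS = {
    "Asus MG279Q (FreeSync)": (35, 90),
    "BenQ XL2730Z (FreeSync)": (40, 144),
    "Acer XB270HU (G-Sync)": (30, 144),
}

def vrr_state(fps, lo, hi):
    """Classify a frame rate against a variable-refresh window."""
    if fps < lo:
        return "below range (falls back to fixed-refresh behavior)"
    if fps > hi:
        return "above range (capped or tearing, depending on settings)"
    return "synchronized"

for name, (lo, hi) in PANELS.items():
    print(f"{name} @ 100 FPS: {vrr_state(100, lo, hi)}")
```

At 100 FPS, the MG279Q is already outside its window while the XL2730Z is still synchronizing, which is exactly the trade-off described above.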



Next, we needed to build platforms around these graphics subsystems. Whereas my Bakersfield frame pacing event involved two PCs — one powered by AMD and the other by Nvidia — this experiment was to be grander. We envisioned eight total machines and two spares, which exceeded our capacity for sourcing components and building our own boxes during off hours. So, we approached Digital Storm, a long-time supporter of the site, to gauge its interest in what we were doing. It, too, signed on right off the bat, offering 10 of its Vanquish 3 systems in Level 4 trim, plus keyboards, mice and headsets. The company even upgraded them to Core i7s, understanding our desire to factor out any potential platform bottleneck from a gaming comparison.

Digital Storm Vanquish 3 (Level 4 Trim):
  • Intel Core i7-4790K
  • ASUS Z97-E motherboard
  • ADATA XPG V2 8GB DDR3-1600
  • Samsung 850 EVO SSD
  • Seagate Barracuda 1TB HDD
  • Zalman CNPS5X Performa CPU cooler
  • Corsair Graphite 230T case
  • Corsair CX750M power supply
  • Corsair Vengeance M65 mouse
  • Corsair K95 RGB keyboard
  • Corsair Vengeance 1500 V2 headset

With a hardware foundation in place, it was time to formulate a plan for testing. Eight machines meant we could parallelize the process. And, depending on the venue, we could run for five or six hours if necessary, cycling groups through at regular intervals. While we originally hoped to get each volunteer through four games on both technologies, the math just didn’t add up. Eight unique test points per participant, at roughly five minutes per experience, adds up to an hour once you factor in switching seats, games and settings. Instead, we’d give two groups of four a chance to play two games in front of G-Sync, and the same for FreeSync.
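The scheduling arithmetic can be sketched in a few lines. The per-seat play time and switching overhead below are assumed figures for illustration, not numbers from the event:

```python
# Back-of-envelope session math for the test plan above.
MINUTES_PER_EXPERIENCE = 5   # assumed play time per game/technology pairing
SWITCH_OVERHEAD = 2          # assumed minutes lost per seat/game/settings change

def session_minutes(test_points,
                    play=MINUTES_PER_EXPERIENCE,
                    switch=SWITCH_OVERHEAD):
    """Total minutes for one participant to complete all test points."""
    return test_points * (play + switch)

print("Original plan (4 games x 2 techs):", session_minutes(8), "minutes")
print("Final plan (2 games x 2 techs):", session_minutes(4), "minutes")
```

Under these assumptions, the original eight-point plan runs close to an hour per participant, while the trimmed four-point plan cycles a group through in about half that.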

Naturally, game selection was its own issue, and it quickly became clear that both AMD and Nvidia knew where their respective strengths and weaknesses would materialize. I’ll spare you the politicking, but we eventually settled on a pair of titles commonly associated with AMD (Battlefield 4 and Crysis 3), and two others in Nvidia’s camp (Borderlands: The Pre-Sequel and The Witcher 3). Battlefield and Borderlands were deemed our “faster-paced” selections, while Crysis and The Witcher were on the slower side. At the end of the day, though, as a PC gamer, I wanted each title to run at the lushest playable detail settings possible. That proved to be another topic of debate between the two companies right up to the day of the event. Admittedly, any preference toward quality or performance is going to be subjective. We conceded that both schools of thought are valid, and one gamer's preference may even change depending on genre.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • Wisecracker
    With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.

    Reply
  • NethJC
    Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.
    Reply
  • AndrewJacksonZA
    Thank you for the event and thank you for your write up. Also, thank you for a great deal of transparency! :-)
    Reply
  • Vlad Rose
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
    Reply
  • jkrui01
as always on toms, nvidia and intel win. why bother making these stupid tests? just review nvidia and intel hardware only, or better still, just post the pics and a "buy now" link on the page.
    Reply
  • loki1944
    Quote (NethJC): Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.

    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.
    Reply
  • Traciatim
    It seems with these hardware combinations and even through the mess of the problems with the tests that NVidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like only full screen).

    I would really like to see this tech applied to larger panels like a TV. It would be interesting considering films being shot at 24, 29.97, 30, 48 and 60FPS from different sources all being able to be displayed at their native frame rate with no judder or frame pacing tools (like 120hz frame interpolation) that change the image.
    Reply
  • Calculatron
    Huzzah!
    Reply
  • cats_Paw
    "Our community members in attendance now know what they were playing on. "
    That's when you lost my interest.

    It is proven that if you give a person this information they will be affected by it, and that unfortunately defeats the whole purpose of using subjective opinions of test subjects to evaluate real-life performance rather than scientific facts (such as frames per second).

    Too bad, since the article seemed to be very interesting.
    Reply
  • omgBlur
    Quote (NethJC): Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.

    Quote (loki1944): Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.

    Speak for yourself, I invested in the ROG Swift and noticed the difference. This technology allows me to put out higher graphics settings on a single card @ 1440p and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces to as low as 40. No stuttering, no screen tearing.
    Reply