AMD FreeSync Versus Nvidia G-Sync: Readers Choose

We set up camp at Newegg's Hybrid Center in City of Industry, California to test FreeSync against G-Sync in a day-long experiment involving our readers.

Introduction

Nvidia's G-Sync variable refresh technology was introduced almost two years ago. AMD wasted no time letting us know it had an equivalent capability planned. Many months later, we got our first taste of what came to be known as FreeSync. And now enthusiasts are asking whether to put their money behind AMD or Nvidia. Rather than answering that largely subjective question in our own unilateral way, we took it to the community by setting up a blind taste test of sorts. The experiment was a great success, and we're ready to share our results.

It probably wouldn’t surprise you to learn that reviewing PC hardware is a fairly solitary occupation. Sure, you sit in on Skype calls and answer a seemingly endless stream of emails. But it becomes easy to forget that there are millions of enthusiasts around the world looking for guidance. They don’t want every new technology to receive a soft recommendation delicately worded to avoid hurting anyone’s feelings. Rather, they pledge allegiance to hard info and rely on technically-minded writers able to distill hundreds of data points into a conclusion: buy this or skip it.

Sometimes that’s an easy call to make. More often it’s not, even with benchmarks illuminating the path. But when a technology can’t be easily put through a suite of quantitative metrics... well, then you’re making a purchasing decision based on our good word. And no doubt that makes both of us just a little bit uncomfortable.

When Nvidia introduced its G-Sync technology, we published G-Sync Technology Preview: Quite Literally A Game Changer. In that piece, we covered technical details, features, configuration and early subjective impressions. But we were off the hook, in a way. Our sample was pre-production. Compatible monitors weren't on sale yet. And we had the word preview right in our title. Still, we knew Nvidia was onto something, even if we couldn't predict G-Sync's uptake.

The market is so much different today. Not only is G-Sync all over the place — available in 24-, 27- and 28-inch form factors, using TN or IPS panels — but screens with AMD’s FreeSync technology are starting to become more popular, too. We count seven compatible models on Newegg as of this writing, and Filippo, who co-authored the G-Sync launch with me, is working on his deep dive into FreeSync. Really, it’s time to take a stand.

Despite Nvidia’s staunch defense of its proprietary approach and AMD’s evangelizing of a standardized tack, both require a commitment to one company’s hardware or the other. There is no mix and match. AMD can't support G-Sync, and Nvidia won't capitulate on FreeSync. So which deserves your investment, G-Sync or FreeSync?

There is no easy way to benchmark the differences between them, making it perplexingly difficult to drench you in data. And we’ve been doing this long enough to know that any proclamation based on one individual’s experience will become a punching bag for the comments section. Clearly, this needed to become a group project.

The Setup

Two years ago, I put together a quick little event in Bakersfield, CA to evaluate the effectiveness of AMD’s frame pacing driver (Radeon HD 7990 Vs. GeForce GTX 690: The Crowd Picks A Winner) from an experiential standpoint. We gleaned some interesting information, talked technology and drank some beer at a local brewery. It was an all-around great time. As AMD’s FreeSync introduction came and went, we collectively decided that something similar, but on a larger scale, would be the best way to compare both variable refresh capabilities.

Editor-in-Chief Fritz Nelson covered the highlights of this most recent get-together in Tom's Hardware Readers Pit G-Sync Vs FreeSync In Weekend Battle, but a lot more went into the planning than our highlight reel shows. Even if this was being called an event — you know, for the purposes of everyone getting to enjoy themselves — it would be handled like a proper laboratory experiment.

The first step, taken months ago, was letting AMD and Nvidia know our intentions and securing their support. Both companies were quick to climb on board, putting the onus on us to create a comparison both sides would consider fair. The hardware had to be decided on first, of course. By the time we were ready to commit, AMD's 300 series had launched. So, Nvidia proposed using GeForce GTX 970 and pitting it against Radeon R9 390X. The GM204-based board could be overclocked to match Grenada XT’s performance, the company assured us. In retrospect, we should have countered that the R9 390 would be a more fitting match. But given the advantage it was being handed, AMD readily accepted that pairing.

MORE: Best Graphics Cards For The Money
MORE: All Graphics Content
MORE: Graphics Cards in the Forum

Picking the monitors that both GPUs would drive proved to be more contentious. I wanted a comparison between FreeSync and G-Sync at 2560x1440, which meant eliminating the LCD panel as a variable as much as possible. Right away, our options were narrowed to two: Acer’s XB270HU (G-Sync) vs. Asus’ MG279Q (FreeSync) or Asus’ PG278Q (G-Sync) vs. BenQ’s XL2730Z (FreeSync). The former would give us two IPS-based screens, while the latter shifted to TN. We’ve long dreamed of the day when fast-refresh IPS monitors would be readily available at QHD resolutions, so the IPS pairing became our first choice. There was just one wrinkle: Asus’ MG279Q has a variable refresh range of 35 to 90Hz, whereas the BenQ can do 40 to 144Hz. So, while the Asus is perhaps a superior example of a gaming monitor, it’s not necessarily the best representation of FreeSync if your graphics subsystem regularly pushes above 90 FPS. Nevertheless, both AMD and Nvidia signed off on our preference for IPS.

MORE: Best Computer Monitors
MORE: All Monitor Content

MORE: Displays in the Forums

Next, we needed to build platforms around these graphics subsystems. Whereas my Bakersfield frame pacing event involved two PCs — one powered by AMD and the other by Nvidia — this experiment was to be grander. We envisioned eight total machines and two spares, which transcended our capacity for sourcing components and building our own boxes during off hours. So, we approached Digital Storm, a long-time supporter of the site, to gauge its interest in what we were doing. It too signed on right off the bat, offering 10 of its Vanquish 3 systems in Level 4 trim, keyboards, mice and headsets. The company even upgraded them to Core i7s, understanding our desire to factor out any potential platform bottleneck from a gaming comparison.

Digital Storm Vanquish 3 (Level 4 Trim)

With a hardware foundation in place, it was time to formulate a plan for testing. Eight machines meant we could parallelize the process. And, depending on the venue, we could run for five or six hours if necessary, cycling groups through at regular intervals. While we originally hoped to get each volunteer through four games on both technologies, the math just didn’t add up: eight unique test points per participant at, say, five minutes per experience would have consumed an hour per group once you factor in switching seats, games and settings. Instead, we’d give two groups of four a chance to play two games in front of G-Sync and the same two in front of FreeSync.
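
To make that scheduling arithmetic explicit, here's a minimal Python sketch. The five-minute play session and the eight-machine parallelism come from the paragraph above; the per-swap overhead and the six-group total are illustrative assumptions rather than the exact figures we used on the day.

```python
# Back-of-the-envelope scheduling math for the event. The five-minute session
# length comes from the text; the swap overhead and group count are assumptions.
SESSION_MIN = 5          # minutes of play per game/technology combination
SWAP_OVERHEAD_MIN = 2.5  # assumed time lost switching seats, games and settings
GROUPS = 6               # 48 readers cycled through in groups of eight

def minutes_per_group(test_points_per_person):
    # All eight machines run in parallel, so a group's time is simply its
    # test points multiplied by (play time + swap time).
    return test_points_per_person * (SESSION_MIN + SWAP_OVERHEAD_MIN)

for plan, points in (("original: 4 games x 2 techs", 8), ("final: 2 games x 2 techs", 4)):
    per_group = minutes_per_group(points)
    total_hours = per_group * GROUPS / 60
    print(f"{plan}: {per_group:.0f} min per group, {total_hours:.1f} h for {GROUPS} groups")
```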

Naturally, game selection was its own issue, and it quickly became clear that both AMD and Nvidia knew where their respective strengths and weaknesses would materialize. I’ll spare you the politicking, but we eventually settled on a pair of titles commonly associated with AMD (Battlefield 4 and Crysis 3), and two others in Nvidia’s camp (Borderlands: The Pre-Sequel and The Witcher 3). Battlefield and Borderlands were deemed our “faster-paced” selections, while Crysis and The Witcher were on the slower side. At the end of the day, though, as a PC gamer, I wanted each title to run at the lushest playable detail settings possible. That’d prove to be another topic for zero-day debate between the two companies. Admittedly, any preference toward quality or performance is going to be subjective. We conceded that both schools of thought are valid, and one gamer's preference may even change depending on genre.

Meet The Players

The logistics of this scaled-up undertaking were formidable, and I have to give props to the Tom’s Hardware crew for putting everything together. I came up with the concept for a comparative experiment, but many other minds executed it.

To begin, there were AMD and Nvidia. Both companies were given the playbook in as much detail as possible each step of the way. In return, they weighed in with feedback, blessed the methodologies and, to our delight, sent representatives to attend. This proved much more valuable than I could have imagined. During setup, the two companies were able to properly configure their respective workstations, overclock to the appropriate levels (in Nvidia’s case) and watch over each other’s shoulders to confirm no foul play. They also came bearing gifts—at the end of our festivities we were able to give away several graphics cards, jerseys, water bottles and even a gaming console.

As mentioned, Digital Storm contributed in a huge way by getting our systems built and shipped out. The company wasn’t able to be there, but its foresight in sending spare machines saved us from shipping damage and one overworked power supply. It was a shame that preserving the experiment’s integrity meant hiding the boxes from view. They really were clean builds. We just couldn’t risk someone peeking inside or catching a glimpse of a rear I/O panel.

We needed a venue large enough to host more attendees, more sponsors and a lot more hardware. Newegg stepped up and let us use its Hybrid Center in City of Industry. Additionally, several Newegg employees showed up bright and early on a Saturday morning to help us set up and to support us throughout the day. They were the first to arrive and the last to leave; we couldn’t have done it without them. On a tangent, if you live in Southern California and shop at Newegg, having the Hybrid Center hold your order for will-call gives you a great excuse to get hands-on with a lot of the technology out on display. It’s really a cool destination tucked away amongst the huge warehouses that typify the area.

The day was destined to be a long one, so we considered it imperative to have activities on-hand to keep our readers entertained. MSI and Zalman both showed up in force with gaming laptops, peripherals, motherboards and the manpower to field questions all day long. Both companies occupied tents outside, flanked by plenty of food and drink (again, provided by Newegg), as each group of eight attendees cycled through the “lab” we built in the Hybrid Center’s lobby. Inside, AMD showcased its prototype of Quantum, with two Fiji GPUs inside, powering Oculus Rift and Crytek's Back to Dinosaur Island 2 demo. Everyone who expressed interest was ushered to a back room, introduced to Quantum and allowed some time to experience a bit of gameplay that few others have seen. A special thanks to AMD for giving our readers the chance to go hands-on with such an exclusive treat.

Of course, I also have to thank the Tom’s Hardware staffers who spent time installing games onto the test platforms, building covers to hide the monitor bezels and stands, registering members of the Tom’s Hardware community online so we knew who to expect, checking everyone in on-site and proctoring the electronic questionnaire that collected data after each group finished the experiment.

Welcome To The Big Show

The event’s energy actually kicked into high gear the night before everyone was to show up. Both AMD and Nvidia arrived a day early to help dial everything in to their mutual satisfaction. Along the way, we discovered a couple of configuration points that needed to be discussed.

First was the decision to turn v-sync on or off outside of FreeSync’s effective range (again, on the Asus MG279Q, this is 35 to 90Hz). One company argued it should be left on, while the other countered it should be off. The decision mattered, of course. With v-sync on, dropping below the variable refresh range has a particularly adverse effect on frame rate, while turning it off causes quite a bit of tearing at frame rates above the VRR.
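
To make the trade-off concrete, here's a simplified sketch of how the FreeSync side behaves at a given frame rate depending on that v-sync setting. The 35-90Hz window is the MG279Q's range from the text; the behavior labels are a deliberate simplification for illustration, not a description of AMD's driver internals.

```python
# Simplified model of variable refresh behavior on the FreeSync-equipped panel.
# The 35-90Hz window is the Asus MG279Q's range; everything else is illustrative.
VRR_MIN, VRR_MAX = 35, 90

def panel_behavior(fps, vsync_outside_range):
    if VRR_MIN <= fps <= VRR_MAX:
        return f"{fps} FPS: inside the range, panel refresh tracks the GPU (no tearing)"
    if fps > VRR_MAX:
        if vsync_outside_range:
            return f"{fps} FPS rendered, but v-sync caps delivery at {VRR_MAX}Hz (added lag)"
        return f"{fps} FPS against a {VRR_MAX}Hz refresh: visible tearing"
    if vsync_outside_range:
        return f"{fps} FPS, below {VRR_MIN}Hz with v-sync on: frame rate takes a further hit"
    return f"{fps} FPS, below {VRR_MIN}Hz with v-sync off: tearing and judder"

print(panel_behavior(140, vsync_outside_range=False))  # a Borderlands-style scenario
print(panel_behavior(55, vsync_outside_range=True))    # a Crysis/Witcher-style scenario
```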

In the end, we settled this by taking it to the community on Facebook, Twitter and our own forums. I counted more than 180 responses on Facebook before siding with the majority and making the call that we’d leave v-sync off, which incidentally coincides with AMD’s default driver behavior. I fired off my decision, set my alarm for 4:30AM and went to bed. The Facebook thread would eventually grow to more than 560 responses.

The next morning, I pulled up to Newegg’s facility just after 7:00AM, followed shortly by the crew that’d run the event. As Tom’s Hardware, AMD and Nvidia began fine-tuning game settings on our systems, it became clear that we had different ideas of what constituted the “best” experience.

Borderlands was easy. Even with its most taxing options enabled, this game runs in excess of 140 FPS on both cards. Given a practical ceiling of 90Hz for FreeSync, we’d be outside the technology’s range virtually the entire time. Would gamers be able to tell? That’s what we’d be looking for in the data. Crysis and The Witcher were set to serve up good-looking graphics. We used a Very High system spec in the former, with 2x SMAA and High texture resolution keeping the average frame rate in the 50s. The latter was cranked up to its Ultra detail preset with High post-processing effects and HairWorks set to Off. Frame rates on both cards were in the 40s.

Originally, I planned to set Battlefield to its Ultra preset at 2560x1440, exactly the way I play and benchmark the game. This would have kept both boards under 90 FPS through our test sequence. Nvidia wanted to dial quality back to High and turn down MSAA, though, nudging frame rates into the 100s. After watching company representatives demonstrate Battlefield at that level, I can say it’s noticeably smoother. However, we also didn’t want to deliberately hobble AMD, particularly with Borderlands already pushing above the Asus monitor's variable refresh range. So, we ended up making a compromise: one set of systems would run AMD and Nvidia at the Ultra preset I originally intended, while another set pitted AMD at its Ultra preset and Nvidia at High. We’d note the workstation numbers and compare responses after the event to see if the pairs scored differently. The two companies agreed to this, and we wrapped up prep just minutes before the event was scheduled to begin.

After a brief welcome speech, the first group of eight sat down at their assigned PCs and fired up their games. They weren't told what to look for; they only knew they were there to experience G-Sync and FreeSync.

From that point on, the day would consist of wave after wave being escorted in, playing for several minutes, switching machines to play the same game on the competing technology, firing up a second game, playing and then switching back. At the end of each round, everyone got up and walked over to four notebooks we had set up with a SurveyGizmo questionnaire. Participants were asked to refrain from discussing their experience.

There was a short break in the middle there where we all hit up the taco truck and shotgunned a Monster, but it was pretty much non-stop action until the end. When the last of the 48 participants submitted his survey, Zalman kicked off our raffle, which continued with thousands of dollars’ worth of swag from MSI, AMD and Nvidia. We concluded with a big reveal, giving away the identities of the PCs they used. The day was exhausting without question, but we all agreed that it was a success.

Our community members in attendance now know what they were playing on. However, they don’t know whether G-Sync or FreeSync won the audience’s favor. Let’s get into the results. [Editor's note: Just to clarify, based on some early comments, the community members only know what they were playing on at the very end of the day, after all participants had played and all surveys had been filled out.]

Test Results: FreeSync or G-Sync?

We begin with the overall result, and then drill down from there. Resist the temptation to take this first chart and use it exclusively as the basis for your next purchasing decision. After all, there were a couple of different factors in play that need to be discussed in greater depth, and this outcome cannot capture their subtleties.

Twenty-nine (or 60%) of our participants chose G-Sync based on their experience in our experiment. Ten (or ~21%) of the attendees picked FreeSync. And nine (or almost 19%) considered both solutions of equal quality. Given the price premium on G-Sync, you can almost count an undecided vote for AMD, since smart money often goes to the cheaper hardware if it delivers a comparable experience.

We expected Nvidia to secure the majority in our overall tally for a couple of reasons. First, Borderlands, which half of our respondents played, ran outside of the FreeSync-equipped Asus monitor’s variable refresh range. So we need to assess how many surveys cited this title as their reason for choosing G-Sync. Second, we created a side experiment on two computers using Battlefield 4, which allowed FreeSync to remain under 90 FPS (in its variable refresh range) while Nvidia’s hardware pushed higher frame rates through dialed-back quality options. One-quarter of our attendees would have unknowingly participated in this one, and we’ll separately compare their responses to the folks who played at the Ultra preset on both technologies.

To be clear, the MG279Q’s variable range of 35 to 90Hz is not a limitation of FreeSync, but rather a consequence of the scaler Asus chose to outfit its monitor with. We wanted to compare similar-looking panels and had to make some compromises to minimize the variables in play. There was an alternate combination that could have stepped us back to TN technology with a range of 40 to 144Hz, but we do not regret running our tests using IPS. Universally, the audience commented on how gorgeous the screens looked, and we suspect that most of the gamers with Asus’ MG279Q on their list have it there because it offers a native QHD resolution, that AU Optronics AHVA panel and refresh rates up to 144Hz. Regardless of whether you enable FreeSync or not, that’s one heck of a gaming product for $600, or $150 less than the G-Sync-equipped equivalent, as priced on Newegg at the time of this writing. [Editor's note: At publishing time, the price of the Acer monitor increased $50. This is reflected in the pricing buttons that link to NewEgg. See page 2. Thus, the price difference grew. Those pricing buttons are dynamic and pull directly from NewEgg, so they also could change over time.]

Asus further shared its plans to launch a TN-based FreeSync-capable screen in early September with a 1ms response time and VRR between 40 and 144Hz. The company will also introduce its G-Sync-equipped IPS line in September, including a 4K/60Hz screen and a QHD/144Hz model.

Eight respondents said there was no single game that stood out to them, and of those, five answered that the two technologies were of equal quality. One respondent left comments but didn’t cite a specific game that affected him, and gave both technologies equal marks.

Crysis was most often cited as the game where a difference was observed, which is difficult to explain technically because it should have been running within the variable refresh range on both competing technologies. Four of those who mentioned Crysis preferred their experience on AMD’s hardware, eight chose Nvidia’s and one said the two technologies were of equal quality, though his Nvidia-based platform did stutter during an intense sequence.

Next was Borderlands, which 11 participants said showed the most difference. Nine of those picked G-Sync as their preference, one chose FreeSync, noting the 390X-equipped machine felt smoother in both Borderlands and The Witcher, and one said they were of equal quality after noticing the tearing on FreeSync, but mentioned a different experience in The Witcher. While we hadn't anticipated it during our planning phase, Borderlands turned out to be a gimme for Nvidia since the AMD setups were destined to either tear (if we left v-sync off) or stutter/lag (if we switched v-sync on). What we confirmed was that a majority of our respondents could see the difference. It's a good data point to have.

Battlefield demonstrated the most variance for eight attendees. Six folks favored what they saw from G-Sync, one picked FreeSync and the last noted that he experienced smoother performance in this game (perhaps compared to Crysis?), but said the two systems were of equal quality.

The Witcher seemed least influential; seven participants saw the biggest difference in it. Three of those thought the 390X delivered better responsiveness, while four favored the GTX 970 and G-Sync. As with Crysis, both G-Sync and FreeSync should have been running within their respective variable refresh ranges based on the settings we chose. So, it’s hard to say with certainty what other factors were affecting gameplay.

How about that mini-experiment we did with Battlefield 4, running two workstations in the same sub-90 FPS range and two at different settings, where FreeSync held its target performance level at higher quality while Nvidia dropped to the High preset for faster frame rates?

Well, of the folks who played on the machines set to Battlefield's Ultra quality preset, three chose the AMD-equipped system, eight went with Nvidia’s hardware and one put them on equal footing. Notably, even respondents who picked one technology over the other mentioned that the two were quite similar and free from tearing. Other in-game differences, like perceived frame rates, seemed to drive the choices.

Right next to them, we had another AMD machine at Ultra settings and an Nvidia box dialed down to the High preset. Again, three respondents picked AMD’s hardware. Seven went with Nvidia, while two said they were of equal quality. Three participants specifically called out smoothness in Battlefield 4 as something they noticed, and nobody reported lower visual quality, despite the dialed-back quality preset and lack of anti-aliasing on the GeForce-equipped machine. Overall, we'd call those results comparable.

Thirty-four of our 48 participants said that yes, they would pay a premium for the experience they preferred over the other configuration. Thirteen wouldn’t, and one answer wasn’t clear enough to categorize.

Interestingly, nine of those who said they’d spend more on the better experience ended up picking the Radeon R9 390X/Asus MG279Q combination. Twenty-three picked the G-Sync-capable hardware. So, 79% of our respondents who preferred the G-Sync experience would be willing to pay extra for it. How much, though?

We asked our participants to quantify the degree to which they favored one solution over the other, if indeed they had a preference. Not all of the answers we received were usable, but of those that were, 16 said they’d be willing to spend up to $100 more for the better experience. Nine capped the premium around $200. And three felt strongly enough to budget significantly more—one gamer committed to up to twice the spend, one said an extra $200-$300 per year and a third would budget up to $500 more for a high-end gaming PC.

A Little About Our Participants

In addition to the base experiment we wanted to run, we also sought to collect some information from the folks in attendance. We know they’re hardware enthusiasts—they’re reading Tom’s Hardware, after all. But how much time do they spend gaming? Were they confident enough in their answer to identify the machine with AMD’s technology inside? How about Nvidia’s? What are the specs of their current gaming PC? Did they have any self-acknowledged biases to one company or the other, and if so, why?

Right off the bat, we found it interesting that 10 of 48 respondents believed they knew which system was which. Of those 10, nine were correct, though for a variety of reasons. One respondent guessed the 390X-equipped PC based on the heat it was putting out, and indeed, AMD’s representative increased fan speed in Catalyst Control Center to help with stability under the tables we were using. Others cited smoothness issues, though in the games mentioned, both FreeSync and G-Sync would have been within their target ranges, so perceived fluidity could be affected by the game, driver optimizations or slight differences in performance (despite our best efforts to equalize frame rates through clock rate tuning). For what it’s worth, both Nvidia and AMD were on the receiving ends of these judgement calls; it wasn't just one or the other that benefited.

We didn’t qualify readers for our event based on any specific criteria aside from availability. Still, we wanted to know how much time the folks passing on their thoughts spent playing their favorite titles during an average week. You could argue that a seasoned gamer would have a heightened sensitivity to tearing or input lag. Or, there’s the counter-argument that a newbie might be less likely to draw conclusions based on preconceived notions.

That distinction didn’t end up mattering. Thirty-one respondents reported playing for more than 10 hours a week. Sixteen were between five and 10 hours a week. Just one claimed zero to four.

It’s a little more difficult to represent each respondent’s system specs visually, and I’m not even sure there’s a concrete correlation between someone’s primary gaming PC and their preference between the two technologies being compared in our experiment. Nevertheless, I was surprised at how many gamers are running high-end setups.

Although we took great pains to keep the hardware we were testing covered, ensuring personal biases didn’t affect our survey results, we still wanted to gauge the general predilections of our audience. Twenty-five respondents identified as agnostic, four claimed to be AMD fans and 19 were Nvidia fans.

As you might imagine, the reasons readers leaned one direction or the other varied greatly. Regardless of the answer, though, many of the folks who wrote in an explanation did mention favoring whichever solution yielded the best experience. Three of the four AMD fans specifically called out pricing. And the Nvidia fans overwhelmingly cited driver stability as their primary motivator, though efficiency came up several times as well.

Summing The Analysis

We started down the path of comparing AMD’s FreeSync technology to Nvidia’s G-Sync in the hopes that a community-based, hands-on approach would provide additional insight or perhaps lead us to some unexpected conclusion. In some ways it has. But we naturally had some hypotheses going into our event, and those largely proved true, too.

Let’s get the big question out of the way: what’s better, G-Sync or FreeSync? As both technologies stand right now, the G-Sync ecosystem is more mature. From compatible graphics cards to G-Sync-capable displays and variable refresh ranges, Nvidia has the leg up. It also has an advantage when you drop below the variable refresh range. Our experiment never took us to that point, fortunately, so it didn't become an issue we needed to address in the analysis.

Then again, you’ll also pay $150 more for the G-Sync monitor we tested today—a premium that exceeds what most of our respondents claimed they’d be willing to spend for the experience they ultimately preferred, and some of those folks even picked AMD’s less expensive hardware combination as their favorite. [Editor's note: As previously noted, the pricing difference between monitors has increased since we published this article. As always, even that is subject to further change.]

Given the technical similarities of what FreeSync and G-Sync set out to achieve, all of our sources suggest you should get fundamentally identical results from them so long as both stay within their variable refresh ranges. On paper, that is. Differences do crop up outside of those upper and lower bounds, where FreeSync and G-Sync ask you to pick between v-sync on or off, and G-Sync is able to double (or more) screen refreshes when frame rates drop below 30 FPS. We chose to leave v-sync off on the FreeSync-capable machines after surveying our readers’ habits and acknowledging AMD’s default driver behavior. However, it would have been interesting to compare results with v-sync on, particularly in Borderlands.
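
As an aside on that last point, here's a minimal sketch of the refresh-multiplication idea: when the frame rate falls below the panel's minimum variable refresh, each frame can be scanned out two or more times so the panel itself never runs below its floor. The 30Hz figure comes from the paragraph above; the multiplier-selection logic is an assumption for illustration, not Nvidia's actual algorithm.

```python
# Illustrative sketch of refresh multiplication below a panel's minimum VRR.
# The 30Hz floor comes from the text; the selection logic is an assumption.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

def refresh_for(fps):
    """Return (times each frame is shown, resulting panel refresh in Hz)."""
    multiplier = 1
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1  # repeat each frame once more so the panel stays in range
    return multiplier, min(fps * multiplier, PANEL_MAX_HZ)

for fps in (75, 28, 12):
    shown, hz = refresh_for(fps)
    print(f"{fps} FPS -> each frame displayed {shown}x, panel refreshes at {hz}Hz")
```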

What about that Asus monitor we chose to represent FreeSync? While we can’t claim to know why the company chose a scaler limited to 90Hz on its 144Hz MG279Q, we’ve heard that it’s selling really well. Our event participants certainly loved it (along with Acer’s XB270HU). A $600 price point certainly isn’t cheap. However, when you’re talking about a 144Hz IPS display, gamers are bound to find a little more room in their budgets. In addition to the enthusiasts buying this screen for its FreeSync support, we suspect that those with high-end graphics subsystems from both GPU vendors are simply running it at 144Hz for fast-paced action on a beautiful-looking panel.

The Bottom Line

AMD is at a disadvantage because it doesn’t have the variable refresh range coverage currently enjoyed by Nvidia. If you spring for the QHD/FreeSync/IPS display we tested with and run a game like Borderlands on a 390X, it’s going to fall outside of 35-90Hz almost exclusively, even if you dial in the most taxing settings possible. Conversely, the QHD/FreeSync/TN screen we could have chosen instead would have likely run into issues with the quality settings we used in The Witcher, which averaged in the 40s, but also dipped lower.

Theoretical similarities between G-Sync and FreeSync aside, we also cannot ignore the fact that a number of our event participants chose the Nvidia solution in games where both FreeSync and G-Sync should have been inside their respective ranges at all times. This happened at a rate of 2:1 in Crysis, and almost 3:1 in Battlefield 4. Those are discrepancies we’d have a tough time attributing to variable refresh. Something else is going on there—a situation made stranger by the fact that both games were picked for their AMD Gaming Evolved affiliations.

Of course, FreeSync is relatively young compared to G-Sync, and we understand that there are hardware and software improvements planned that’ll address some of the technology’s current weaknesses. Technical individuals within AMD and Nvidia acknowledge that, inside the variable refresh range, FreeSync and G-Sync should be equally enjoyable. While our experimental data actually gives Nvidia an edge where one wasn’t expected, paying an extra $150 or more for it may sway certain enthusiasts the other way.

As for us, we’re just glad both technologies exist. Nvidia should be commended for its innovation, which set us on this path almost two years ago. AMD is taking a different approach, and it’s progressing much more slowly. But viable, even successful, partner products are available, as evidenced by the MG279Q. No doubt FreeSync's lower barrier to entry will be appreciated by more gamers as the line-up of compatible components grows. Might the same sort of experiment held a year from now yield different results? It’s hard to say. But based on the enthusiasm we saw at Newegg’s Hybrid Center, we’re confident we have the crew for whatever testing is needed.

Post Mortem

There's no such thing as a perfect experiment. We put a ton of time into planning our event, were as transparent as possible with both AMD and Nvidia ahead of it and still ran into zero-day issues that had to be dealt with. As part of our analysis, we thought it important to follow up with both companies and get their feedback. Specifically, we wanted suggestions on ways to make future events better. Part of this involved facing perceived shortcomings. Some of these came from AMD and Nvidia, and others were noted by our own team.

Let's start with Nvidia's commentary, provided by Tom Petersen, director of technical marketing and one of our attendees.

The side by side blind testing technique is a great way to get some direct feedback from gamers about new technologies. Unfortunately, it is also the case that whenever someone knows what we are testing for they are biased to some extent which could inadvertently impact their testing and feedback.

There are a few techniques that can help mitigate this inherent expectation bias:

1. Double blind studies – the test administrator and the testers should not know what is being tested.

    a. Don’t tell the gamers the purpose of the evaluation – knowledge of this being a G-Sync vs. FreeSync could impact results.

    b. Use volunteers to run the test flow to eliminate the risk of administrators passing along test information

2. Include a control group with nothing new. In this case I would have used one of the monitors in “fixed refresh rate mode.”

3. Increase the sample size. This may be very difficult in practice, but more data is definitely better when science is involved.

Overall I enjoyed the opportunity to engage with THG’s community. I look forward to seeing the results.

We especially like Tom's suggestion to use a control group in a fixed refresh mode for comparison. Given a longer day and perhaps more activities to keep other folks busy, we would like to see gamers on three systems, one of them being a control of some sort.

A larger sample size was on our wish list all along, but there's only so much you can do with eight machines and one Saturday afternoon. This event was already several times as large as our last one, and we'll definitely shoot for something even larger next time.

The idea to keep the purpose of the experiment under wraps is also intriguing, though I'm not sure we'd have as much luck getting volunteers to sign up without some sort of teaser ahead of time. This and volunteer-run testing might be ideal, but they present us with some practical challenges we'll have to think about.

Now for AMD's feedback, which comes to us by way of Antal Tungler, public relations manager, who helped us coordinate the company's participation (including AMD attendees).

AMD is always happy to see this sort of testing become available to end users and community members and people who are just interested in tech and PC gaming. We applaud Tom’s for this initiative regardless of the outcome. AMD FreeSync technology has now been on the market for almost six months and we’ve seen terrific adoption from display vendors: there are now 20 FreeSync-enabled monitors on the market with more on the way.

A couple of thoughts regarding this test:

-Because AMD FreeSync technology enables such a wide variety of display tech and refresh rates for vendors to productize, we believe a true Pepsi-style challenge for DRR displays should aim to keep the frame rates in the DRR zone all the time. It’s also important that all parties run at nearly identical frame rates, which greatly benefits a true Pepsi-style challenge. We’d love to see more emphasis on this in the future.

-Choosing games and settings carefully is paramount to make sure that the scenarios gamers look at are reproducible, consistent and glitch-free. Maybe there’s some room for improvement there.

-One could consider including a DRR specific benchmark, like the AMD Windmill application. While it certainly shouldn’t be the only method of testing, it would be interesting to add to the overall results.

We believe that some last minute changes made before the event (that didn’t necessarily guide the experiment in a true apples-to-apples comparison’s direction) make it difficult to call it a true Pepsi challenge. Regardless, we’re really happy to see Tom’s Hardware putting this much effort into pulling together this event, and are grateful for the opportunity to have participated in it. We’re sure with some of the above changes implemented, there are many more events like this coming in the future that benefit end users and the industry as a whole. Thank you!

The changes AMD is referring to are the zero-hour decision to leave v-sync off outside of its variable refresh range and the side experiment we put together in Battlefield 4. On the first point, I really wish we had thought to specify v-sync behavior one way or the other back when we were disclosing everything to both companies. But given the majority vote of our readers and AMD's default behavior, I'm comfortable with where we ended up for the event.

Allowing Nvidia to set one of its systems up with different settings in Battlefield is a fair protest on AMD's part, even if we generated useful data from it. Done over, I would be more adamant that the settings selected before the event be universal, and if we wanted a separate experiment, run it during lunch or with stragglers after the official proceedings.

I do, however, disagree that games and settings should be chosen to keep both solutions in their variable refresh range. G-Sync and FreeSync have dissimilar VRRs right now, and that has to factor into any buying decision. Forcing the technologies into their bands, however wide or narrow they might be, overlooks that the bands aren't equal.

My own feedback is more pointed than that of either AMD or Nvidia (both organizations were polite and professional each step of the way). I'd be more tempted, in retrospect, to use TN-based screens, giving AMD a more generous VRR of 40 to 144Hz, if only to see how the Borderlands results would change. This is disappointing because I've maintained for two years that I want three high-refresh IPS panels on my desk for gaming. Stepping back to TN for the broader VRR wouldn't interest me, personally. But that's a reflection of where we're at right now with FreeSync. Hopefully the initiative continues gaining momentum and we see its growing pains remedied.

Still, had we gone with the BenQ screen instead, a lot of the other issues we had on game day might not have arisen. Or maybe we would have figured out something else to argue about. This was a battle between two graphics giants, after all.

MORE: Best Graphics Cards For The Money
MORE: All Graphics Content
MORE: Graphics Cards in the Forum

MORE: Best Computer Monitors
MORE: All Monitor Content

MORE: Displays in the Forums

Chris Angelini is a Technical Editor at Tom's Hardware. Follow him on Twitter and Google+.

    Top Comments
  • marraco
    This test means nothing. It should had made with a statistician, the same way that a motherboard review is done by somebody knowledgeable in motherboards.

    48 samples is too low. If you generate 48 samples randomly, most results will look like this, instead of 50% / 50%

    Start excel, generate 1000 samples at random, and it will come close to 50% for each choice (example: 55/45%). But if you generate only 48, most trials will be extremely biased in favor of any alternative.

    In other words, this test is not different than a random result. It means nothing. You can repeat exactly the same experiment and get the opposite results.

    Worse, if 10 out of 50 players (20%) know what is the real hardware, then it gets even more biased.

    Of course, you cannot collect 1000 monitors, but you can cycle the players, and get more players.
  • AndrewJacksonZA
    Thank you for the event and thank you for your write up. Also, thank you for a great deal of transparency! :-)
  • Traciatim
    It seems with these hardware combinations and even through the mess of the problems with the tests that NVidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like only full screen).

    I would really like to see this tech applied to larger panels like a TV. It would be interesting considering films being shot at 24, 29.97, 30, 48 and 60FPS from different sources all being able to be displayed at their native frame rate with no judder or frame pacing tools (like 120hz frame interpolation) that change the image.
  • Other Comments
  • Wisecracker
    With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.
  • NethJC
    Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.
  • Vlad Rose
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
  • jkrui01
    as always on toms, nvidia and intel wins, why bother making this stupid tests, just review nvida an intel hardware only, or better still, just post the pics and a "buy now" link on the page.
  • loki1944
    Quote:
    Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.


    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.
  • Calculatron
    Huzzah!
  • cats_Paw
    "Our community members in attendance now know what they were playing on. "
    Thats when you lost my interest.

    It is proven that if you give a person this information they will be affected by it, and that unfortunatelly defeats the whole purpose of using subjective opinions of test subjects to evaluate real life performance rather than scientific facts (as frames per second).

    too bad since the article seemed to be very interesting.
  • omgBlur
    Quote:
    Quote:
    Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.


    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.


    Speak for yourself, I invested in the ROG Swift and noticed the difference. This technology allows me to put out higher graphics settings on a single card @ 1440p and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces to as low as 40. No stuttering, no screen tearing.
  • rhysiam
    Quote:
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.

    But it was a blind test. The rigs were obscured. It sounds like at least one participant guessed the amd rig based on heat or fan noise, but otherwise they could judge only based on what they saw on screen.
  • cats_Paw
    Nevermind, I missunderstood that statement :D.
  • Jake Hall
    YEAH-HA! Nvidia, Bitches!!!
  • Vlad Rose
    Anonymous said:
    Quote:
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.

    But it was a blind test. The rigs were obscured. It sounds like at least one participant guessed the amd rig based on heat or fan noise, but otherwise they could judge only based on what they saw on screen.


    Actually it was 9: "Right off the bat, we found it interesting that 10 of 48 respondents believed they knew which system was which. Of those 10, nine were correct, though for a variety of reasons." . Around 20%
  • molo9000
    Comparing different monitors with different specs and different prices doesn't tell you anything about the technology. This is just a comparison of different monitors.
  • jasonelmore
    Quote:
    Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.


    These variable sync monitors excel at lower framerates.. Almost everyone uses a Single GPU setup. Even a GTX 980Ti, will dip below 60 FPS on some games..

    On a regular monitor, lets say a game runs 60 FPS, and then a big fight scene happens, lowering the frame-rate to 57 FPS.. on a regular monitor, it will drop it down to 30 FPS instead of 57 FPS, since the monitor has two modes 30hz/FPS, or 60hz/FPS

    on a gsync or freesync monitor, it drops the monitor refresh rate to 57hz, which lets you get 57 FPS.

    your argument is all about very high frame rate. even with a very high 144hz rate,, it still needs to be able to do 143hz, 142hz 141hz and so on. it cant do those lower refresh rates without Gsync or Freesync.

    you can buy a additonal GPU to try and keep it at 144hz or 144 FPS, but money is more wisely spent on a Gsync or Freesync monitor. uses less power, cost less money, and less heat
  • Achoo22
    Quote:
    With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.

    Sadly, I must completely agree. Please consult a statistician when setting up future experiments.
  • cegasaturn
    As an event attendee, it's great to be able to read the results! Any chance you can tell us which set of stations ran Battlefield at different quality settings?
  • clonazepam
    As a follow up, I'd like all frame pacing data put under the microscope. Do we see particular game engines / developers stand out? Do later versions of the engines improve or worsen the situation?

    Is it a software problem that hardware is trying to solve? Can it be solved in software with better development? Will the appearance of a hardware solution encourage less effort to be put into the issue on the software side?
  • JackNaylorPE
    1. First off, it was interesting that nVidia matched the 970 against the 390x recognizing that the difference in overclocking headroom between the 2 cards makes out of the box performance comparisons futile.

    2. One thing that should have been made clear is whether the BenQ XL2730Z was factory updated, a fresh-off-the-line model with the firmware update, or the original with broken FreeSync.

    http://www.tftcentral.co.uk/reviews/benq_xl2730z.htm

    "From a monitor point of view the use of FreeSync creates a problem at the moment on the XL2730Z at the moment. The issue is that the AMA setting does nothing when you connect the screen over DisplayPort to a FreeSync system. This applies whether you are actually using FreeSync or not, you don't even need to have the option ticked in the graphics card settings for the problem to occur. As a result, the setting appears to be in the off state, and changing it to High or Premium in the menu makes no difference to real-World response times or performance. As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.

    Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on."

    This was fixed as of June 1st, but many monitors were purchased or were still in the channel when this occurred.

    3. While G-Sync / FreeSync come in handy in the 30 - 70 fps range, the key for me is what happens when you have 80 or 100+ ... at this point you would turn off G-Sync and use ULMB but how does ULMB compare with Freesync ?