For Honor Performance Review

How We Test For Honor

Test Configuration

CPU
Motherboard (LGA 1151)
RAM
System SSD
Controller: Intel PCH Z170 SATA 6 Gb/s
Power Supply
Case
Operating System: Windows 10 x64 Enterprise 1607 (14393.693)
Drivers: Nvidia GeForce Game Ready 378.66, AMD Radeon Crimson Edition 17.2.1

We're using a fairly mid-range gaming PC that we hope represents what a lot of our readers are running. If you're lucky enough to own faster hardware, you'll naturally enjoy higher frame rates. Steam's survey of hardware and software configurations offers us a view of the most prevalent components and settings (the data comes from January 2017):

  • Windows 10 64-bit, representing 48.5% of the market.
  • 8GB of RAM, present in 34% of surveyed gaming PCs (our configuration has 16GB).
  • Full HD (1920x1080 pixels) is used by 39% of gamers, while 25% are still at 1366x768. QHD (2560x1440 pixels) is used by less than 2% of gamers, and 4K remains a rarity.
  • Quad-core CPUs are installed in almost half of the surveyed systems (47%, to be precise). Logically, our configuration is built around a mid-range quad-core Intel CPU.

Graphics Card Choices

We picked six graphics cards to test. They're mostly mid-range, and they represent popular choices from the current and previous generation of architectures. Here are the competing cards:


Given its stock clock rate, XFX's Radeon RX 480 “Core” is at a disadvantage against the Asus Strix OC and its 1645 MHz GPU frequency. We're overclocking it by 4% to the level of a factory-tuned model, yielding a 1340 MHz GPU clock and 2 GHz memory.
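As a quick back-of-the-envelope check of that bump (a minimal sketch; the stock clock shown is implied by the figures above rather than taken from a spec sheet):

```python
# Back-of-the-envelope check of the +4% overclock described above.
# The stock boost clock is implied by the quoted numbers, not pulled
# from a spec sheet.
TARGET_GPU_MHZ = 1340   # factory-tuned level we overclock to
OVERCLOCK = 0.04        # +4% bump

implied_stock_mhz = TARGET_GPU_MHZ / (1 + OVERCLOCK)
print(f"Implied stock GPU clock: {implied_stock_mhz:.0f} MHz")               # ~1288 MHz
print(f"After the +4% bump: {implied_stock_mhz * (1 + OVERCLOCK):.0f} MHz")  # 1340 MHz
```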


These two cards mark where the mid-range segment starts. The Radeon RX 470 should have an advantage thanks to its additional gigabyte of memory, though, given For Honor's forgiving requirements, the extra capacity may not make a difference at 1920x1080.


Nvidia's GeForce GTX 970 and AMD's Radeon R9 390 are previous-gen cards, but they'll no doubt remain popular in mid-range gaming PCs for months to come.

Test Procedure

All performance data is collected using the PresentMon tool and our own custom front-end.
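To give a feel for what that post-processing involves, here is a minimal sketch of crunching a PresentMon capture. It assumes the standard "MsBetweenPresents" column from PresentMon's CSV output; our in-house front-end does more, and the file name is just a placeholder.

```python
# Minimal sketch: compute average FPS and 99th-percentile frame time
# from a PresentMon CSV capture (uses the standard "MsBetweenPresents"
# column). Our actual front-end is more elaborate; this shows the core math.
import csv
import statistics

def summarize(csv_path):
    frame_times_ms = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times_ms.append(float(row["MsBetweenPresents"]))

    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    # 99th-percentile frame time: a common "worst case" smoothness metric.
    p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]
    return avg_fps, p99_ms

avg_fps, p99_ms = summarize("for_honor_run.csv")  # hypothetical capture file
print(f"Average: {avg_fps:.1f} FPS, 99th-percentile frame time: {p99_ms:.2f} ms")
```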

In order to represent graphics card performance accurately, each test subject is warmed up to a stable temperature before measurements are collected. Most GPUs employ mechanisms to optimize clock rates based on variables like power and temperature. So, tests run during the warm-up period would convey better performance than you'd see in the real world.

We therefore run the benchmark sequence twice to warm up each card. Then we gather the data for our charts. For graphics options, we're testing at 1080p, 1440p, and 4K using the High and Extreme quality presets.
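The warm-up logic, in outline, looks like the hypothetical harness below. The run_benchmark_pass() function is a placeholder standing in for our actual benchmark trigger, not real tooling.

```python
# Hypothetical outline of the warm-up-then-measure procedure described
# above; run_benchmark_pass() is a placeholder for launching the
# in-game benchmark sequence.
import random
import time

WARMUP_PASSES = 2  # two full runs to let boost clocks settle

def run_benchmark_pass(card, resolution, preset, record=False):
    """Placeholder for the actual benchmark run."""
    time.sleep(0.1)  # stands in for the real benchmark duration
    return {"avg_fps": random.uniform(40, 90)} if record else None

def run_test(card, resolution, preset):
    # Warm-up passes: results are thrown away, since clocks are still
    # higher than they would be at steady-state temperature.
    for _ in range(WARMUP_PASSES):
        run_benchmark_pass(card, resolution, preset)
    # Only this pass is charted.
    return run_benchmark_pass(card, resolution, preset, record=True)

for resolution in ("1920x1080", "2560x1440", "3840x2160"):
    for preset in ("High", "Extreme"):
        result = run_test("GTX 1060 6GB", resolution, preset)
        print(resolution, preset, f'{result["avg_fps"]:.1f} FPS (simulated)')
```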


Comments (18)
  • Sakkura
    While it is nice to see how last-gen cards like the 970 and 390 compare to current-gen cards, I think it would have been nice to see cards outside this performance category tested. How do the RX 460 and GTX 1050 do at 1080p High, for example?
  • coolitic
    I find it incredibly stupid that Toms doesn't make the obvious point that TAA is the reason why it looks blurry for Nvidia.
  • alextheblue
    746565 said:
    I find it incredibly stupid that Toms doesn't make the obvious point that TAA is the reason why it looks blurry for Nvidia.

    They had it enabled on both cards. Are you telling me Nvidia's TAA implementation is inferior? I'd believe you if you told me that, but you have to use your words.
  • Yannick_5
    Interesting test, thanks for your report.
  • irish_adam
    746565 said:
    I find it incredibly stupid that Toms doesn't make the obvious point that TAA is the reason why it looks blurry for Nvidia.


    As has been stated, settings were the same for both cards; maybe Nvidia is sacrificing detail for the higher FPS score? I also found it interesting that Nvidia used more system resources than AMD. I was confused why the 3GB version of the 1060 did so poorly considering the lack of VRAM usage, but after looking it up, Nvidia gimped the card. It seems a bit misleading for them both to be called the 1060; if you didn't look it up, you would assume they are the same graphics chip with just differing amounts of RAM.
  • anthony8989
    ^ Yeah, it's a pretty shifty move. Probably marketing-related. The 1060 3GB and 6GB have a similar relationship to the GTX 660 and 660 Ti from a few generations back. Only back then they were nice enough to differentiate the two with the Ti moniker. Not so much this time.
  • ElectrO_90
    Maybe it's my eyes, but the Nvidia pictures look blurrier than the AMD ones.
  • cmi86
    Still using reference RX 480s... Nice...
  • Martell1977
    359683 said:
    I was confused why the 3GB version of the 1060 did so poorly considering the lack of VRAM usage, but after looking it up, Nvidia gimped the card. It seems a bit misleading for them both to be called the 1060; if you didn't look it up, you would assume they are the same graphics chip with just differing amounts of RAM.


    I've been calling this a dirty trick since release, and it's part of the reason I recommend people get the 480 4GB instead. Just a shady move on Nvidia's part, but they are known for such things.

    Odd thing, though: the 1060 3GB in laptops doesn't have a cut-down chip. It has all its cores enabled and just runs lower clocks and VRAM.
  • ohim
    Nvidia's screen looks like a lower-res, upscaled image...
  • iam2thecrowe
    Blurrier and using less VRAM on the Nvidia cards... I'd probably move away from the presets and try to figure out what's wrong; there's no direct performance comparison with different settings. Nvidia's AA techniques lately seem to produce a blurry image quite often.
  • bramahon
    Seems like an optimized, scalable game where performance is in line with the visuals. Not something we can say about most ports these days!
  • xerxes-the king of kings
    It has nothing to do with Nvidia; it's related to the amount of graphics memory, like the RX 460 2GB's poor texture quality in Gears of War 4.
  • anbello262
    Maybe the picture quality is just some kind of driver-related issue for Nvidia? There is too much of a difference if it's just different algorithms.
  • tomohiko
    Small memory usage on "Extreme High"??
    I am only guessing, but switching on specific settings may allow the CPU/GPU to cut some calculations for shadows or rendering behind well-placed objects (unintentional and very situational).
  • Olle P
    To me it's quite obvious that the differences in CPU and RAM usage come from the frame rate.
    Since the GTX 1060 can produce more frames, the CPU needs to work harder to keep up with demand, and while doing so it also uses more RAM to store intermediate data.
  • Martell1977
    362640 said:
    To me it's quite obvious that the differences in CPU and RAM usage come from the frame rate. Since the GTX 1060 can produce more frames, the CPU needs to work harder to keep up with demand, and while doing so it also uses more RAM to store intermediate data.


    Higher frames but lower quality. Something is off; I'm starting to question Nvidia's performance numbers. Who cares if you're getting 100 frames if the image is garbage?