Battlefield 1 Performance In DirectX 12: 29 Cards Tested

Battlefield 1 launched in October of 2016, so yeah, you could say our analysis of the game’s performance is fashionably late. But we’re making up for it by including copious amounts of data. How thorough did we get?

Well, we tested 16 different graphics cards at two resolutions on a mainstream gaming platform using all four of Battlefield 1’s quality presets. Then we tested 19 more cards on a high-end gaming platform at three resolutions, again using all four presets. To wrap up, we benchmarked five host processing configurations with two, four, six, eight, and 10 cores at three resolutions to compare CPU scaling. All told, we have 267 runs of the same sequence charted in various ways.

Battlefield 1: A Brief Recap

At this point, Battlefield 1 is a mature AAA title. DICE even released the game’s first major expansion, They Shall Not Pass, last month. Believe it or not, BF1 is considered the 15th installment in a franchise dating back to 2002’s Battlefield 1942. All but three were available for the PC, and excluding that trio, we’ve seen Battlefield evolve across five versions of the Refractor and Frostbite game engines. Frostbite 3.0, upon which Battlefield 1 was built, previously powered Battlefield 4 and Battlefield Hardline. Naturally, then, DirectX 12 is supported, and that’s where we focus our testing efforts.

EA’s minimum and recommended system requirements are fairly stout for a game with such mass appeal. In fact, they look a lot like the requirements for Mass Effect: Andromeda, also built on the Frostbite 3 engine.

Minimum Configuration

  • Processor: AMD FX-6350 or Intel Core i5-6600K
  • Memory: 8GB
  • Graphics Card: AMD Radeon HD 7850 2GB or Nvidia GeForce GTX 660 2GB
  • Operating System: Windows 7, 8.1, 10 (64-bit only)
  • Disk Space: 50GB
  • Online: 512 Kb/s or faster Internet connection

Recommended Configuration

  • Processor: AMD FX-8350 or Intel Core i7-4790
  • Memory: 16GB
  • Graphics Card: AMD Radeon RX 480 4GB or Nvidia GeForce GTX 1060 3GB
  • Operating System: Windows 10 (64-bit only)
  • Disk Space: 50GB
  • Online: 512 Kb/s or faster Internet connection

Unfortunately, EA’s stance on multi-GPU is clear: those configurations are not officially supported in Battlefield 1. This has become a point of contention of late, since two big game updates apparently caused problems for owners of CrossFire- and SLI-equipped rigs in DX11 (neither technology works at all under DX12). As of this writing, you’re best off with a fast single-GPU setup in this game, regardless of the API you choose (though AMD did just release Radeon Software Crimson ReLive Edition 17.4.2, which supposedly addresses CrossFire scaling in Battlefield 1).

Graphics Settings

The first screen you’re presented with upon clicking More -> Options -> Video includes settings for screen mode, device, resolution, brightness, vertical sync, field of view, motion blur, weapon depth of field, and a colorblind mode. As far as our testing goes, we toggle between 1920x1080, 2560x1440, and 3840x2160, and leave v-sync disabled. Everything else remains default.

All of the quality-oriented options live on the Advanced tab. There, you can toggle DirectX 12 support on or off and alter the resolution scale, UI scale factor, and maximum frame rate. A GPU Memory Restriction setting keeps the game from using more memory than your graphics card actually has, and a Graphics Quality field selects between presets. The five options are Low, Medium, High, Ultra, and Custom.

Low sets Texture Quality, Texture Filtering, Lighting Quality, Effects Quality, Post Process Quality, Mesh Quality, Terrain Quality, and Undergrowth Quality to Low, and disables Antialiasing Post and Ambient Occlusion.

The Medium preset dials all of those options up one notch, also setting Antialiasing Post to TAA (temporal anti-aliasing) and Ambient Occlusion to HBAO (horizon-based ambient occlusion). TAA goes a long way in eliminating the nasty shimmer artifacts that affect objects like barbed wire, so the feature is recommended whenever your horsepower budget allows for it.

High bumps each setting up once more, maintaining TAA and HBAO.

Ultra does the same, and again leaves TAA and HBAO active.
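Incidentally, everything these menus touch is persisted as plain-text GstRender.* variables in the PROFSAVE_profile file under Documents\Battlefield 1\settings, which makes repeatable, scripted benchmark setups possible. Below is a minimal sketch of pinning the toggles we care about (DX12, resolution scale, v-sync) from outside the game; the variable names follow Frostbite’s usual convention and are assumptions on our part, so verify them against your own profile and back the file up first.

    # Minimal sketch: pin BF1's benchmark-relevant toggles by editing the
    # GstRender.* entries in PROFSAVE_profile. Variable names are assumed
    # from Frostbite convention; check them against your own file.
    import re
    from pathlib import Path

    SETTINGS = {
        "GstRender.Dx12Enabled": "1",          # the Advanced tab's DX12 switch
        "GstRender.ResolutionScale": "1.000000",
        "GstRender.VSyncEnabled": "0",         # we test with v-sync disabled
    }

    profile = Path.home() / "Documents" / "Battlefield 1" / "settings" / "PROFSAVE_profile"
    text = profile.read_text()
    for key, value in SETTINGS.items():
        # Replace "Key <oldvalue>" in place; warn if a key isn't present.
        text, count = re.subn(rf"{re.escape(key)} \S+", f"{key} {value}", text)
        if count == 0:
            print(f"warning: {key} not found; name may differ in your profile")
    profile.write_text(text)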

How We Test Battlefield 1

This performance exploration is much more in-depth than the game coverage we typically try to publish days after a new title launches. It involves 29 unique graphics cards and two distinct platforms.

For our mainstream platform, we wanted to get as close to Battlefield 1’s minimum requirements as possible. Originally we had an FX-4350 installed, but we swapped it out in favor of the lower-frequency FX-8320 (apparently our old six-core chips are no longer where they’re supposed to be). Eight gigabytes of G.Skill DDR3-1333 on MSI’s 990FXA-GD80 motherboard is right in line with EA’s lowest spec. Windows 10 Pro, of course, is necessary for testing under DirectX 12.

The higher-end platform needed to be powerful, but not budget-breakingly so. A Core i7-6700K on MSI’s Z170A Gaming M7 with 16GB of G.Skill DDR4-2133 is plenty fast to illustrate any differences between mid-range and enthusiast-oriented graphics hardware. Ryzen wasn't ready yet when testing commenced, so we miss out on AMD's latest and greatest. As you'll see shortly, though, the quality presets gamers really want to use are predominantly GPU-bound anyway.

Then there are the graphics cards. Notably missing is the GeForce GTX 1080 Ti, which also wasn't out when our data was collected. Titan X (Pascal) comes close enough to that board's performance, though.

AMD

1st-Gen GCN

  • Radeon R9 270 2GB
  • Radeon R9 280X 3GB

2nd-Gen GCN

  • Radeon HD 7790 2GB
  • Radeon R9 290 4GB
  • Radeon R9 290X 4GB
  • Radeon R9 390 8GB
  • Radeon R9 390X 8GB

3rd-Gen GCN

  • Radeon R9 380 4GB
  • Radeon R9 Fury 4GB
  • Radeon R9 Fury X 4GB

4th-Gen GCN

  • Radeon RX 460 4GB
  • Radeon RX 470 4GB
  • Radeon RX 480 8GB

Nvidia

Kepler

  • GeForce GTX 760 2GB
  • GeForce GTX 770 2GB
  • GeForce GTX 780 3GB
  • GeForce GTX 780 Ti 3GB
  • GeForce GTX Titan 6GB

Maxwell

  • GeForce GTX 950 2GB
  • GeForce GTX 960 2GB
  • GeForce GTX 970 4GB
  • GeForce GTX 980 4GB
  • GeForce GTX 980 Ti 6GB
  • GeForce GTX Titan X 12GB

Pascal

  • GeForce GTX 1050 Ti 4GB
  • GeForce GTX 1060 6GB
  • GeForce GTX 1070 8GB
  • GeForce GTX 1080 8GB
  • Titan X 12GB
There is no built-in benchmark, so we had to find a sequence that could be reproduced hundreds of times without much risk of death. The opening of Episode 4, O La Vittoria, gives us 80 seconds to collect data, from the second artillery piece firing to reaching the barbed wire fence. Performance is captured using the tools detailed in PresentMon: Performance In DirectX, OpenGL, And Vulkan. Check out the complete sequence below:

Battlefield 1 Test Sequence
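For readers who want to replicate our numbers, the workflow is simple: PresentMon logs one CSV row per frame while the sequence plays, and the frame times are summarized afterward. Below is a rough sketch of that post-processing, assuming a capture along the lines of "PresentMon64.exe -process_name bf1.exe -output_file run.csv -timed 80" (flag names vary a bit between PresentMon releases, so check yours); MsBetweenPresents is PresentMon's per-frame interval column.

    # Rough sketch: turn a PresentMon CSV (one row per frame) into the
    # average frame rate and 99th-percentile frame time style of metrics
    # charted here. Capture first with something like:
    #   PresentMon64.exe -process_name bf1.exe -output_file run.csv -timed 80
    import csv
    import statistics

    def summarize(csv_path):
        frame_times = []  # milliseconds between consecutive presents
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                frame_times.append(float(row["MsBetweenPresents"]))
        frame_times.sort()
        # 99th-percentile frame time, clamped to the last sample
        p99 = frame_times[min(len(frame_times) - 1, int(len(frame_times) * 0.99))]
        return {
            "frames": len(frame_times),
            "average_fps": 1000.0 / statistics.mean(frame_times),
            "p99_frame_time_ms": p99,
        }

    print(summarize("run.csv"))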

Bear in mind that this is but one slice of action from a long and varied single-player campaign. Moreover, the multi-player experience is much more frenetic, and based on what we’ve seen from Battlefield games in the past, we know it makes thorough use of fast multi-core CPUs.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Comments
  • envy14tpe
    Wow. The amount of work in putting this together. Thanks, from all the BF1 gamers out there. You knocked my socks off, and are pushing me to upgrade my GPU.
  • computerguy72
    Nice article. It would have been interesting to see the 1080 Ti and the Ryzen 1800X mixed in there somewhere. I have a 7700K and a 980 Ti; it would be good info to get some direction on where to take my hardware next. I'm sure other people might find that interesting too.
  • Achaios
    Good job, just remember that these "GPU showdowns" don't tell the whole story, because cards are running at stock, and there are GPUs that can get huge overclocks, thus performing significantly better.

    Case in point: the GTX 780 Ti.

    The 780 Ti featured here runs at stock, which was an 875MHz base clock and a 928MHz boost clock, whereas third-party cards ran at 1150MHz and boosted up to 1250-1300MHz. We are talking about 30-35% more performance for this card that you aren't seeing here at all.
  • xizel
    Great write-up, just a shame you didn't use any i5 CPUs. I would really have liked to see how an i5-6600K competes with its four cores against the HT i7s.
  • Verrin
    Wow, impressive results from AMD here. You can really see that Radeon FineWine™ tech in action.
  • Anonymous
    And then you run in DX11 mode and it's faster than DX12 across the board. Thanks for the effort you put into this, but it's rather pointless, since DX12 has been nothing but a pile of crap.
  • pokeman
    Why does my 680 OC 2GB SLI setup run this at 100Hz, 3440x1440? 2133 G.Skill, 4770K @ 4.2GHz.
  • NewbieGeek
    @XIZEL My i5-6600K @ 4.6GHz and RX 480 get 80-90 FPS at max settings on all 32v32 multiplayer maps, with very few spikes either up or down.
  • ohim
    780 Ti below an R9 290, three years down the road...
  • Jupiter-18
    Fascinating stuff! Love that you are still including the older models in your benchmarks; it makes for great info for a budget gamer like myself! In fact, this may help me determine what goes in the budget build I'm working on right now, which was going to have dual 290Xs (preferably 8GB if I can find them), but now might have something else.
  • dfg555
    It's funny how at 4K the Fury and Fury X are able to match 980 Ti speeds, yet the game is utilizing way more than 4GB of VRAM. HBM doing its wonders.

    Also, what happened to Kepler? RIP 780 Ti vs. 290X.
  • Achaios
    Like I wrote above, the GTX 780 Ti they have here is running at stock, which was 875/928MHz. A third-party GTX 780 Ti such as the Gigabyte GTX 780 Ti GHz Edition, which boosts to 1240MHz, scores 13540 in 3DMark Fire Strike graphics, which is just 20 marks or so less than the R9 390X, and significantly faster than the R9 290X, RX 470, RX 480, and GTX 1060 6GB.
    http://imgur.com/KIP0MRt 3DMark Fire Strike results here: https://www.futuremark.com/hardware/gpu
  • sunny420
    Awesome work, folks! The data!!
    The only thing I felt was missing was what Xizel mentioned. It would have been great to see an i5 included in the Scaling: CPU Core Count chart. All of the i7s perform similarly, with only one i3 outlier for data points. It would have been nice to see a middle-of-the-road offering in one of the i5s.
  • damric
    Should have kept Hyper-Threading enabled on the i3, since the whole point of the i3's existence is Hyper-Threading.

    1st-gen GCN really pulled ahead of Kepler over the years.
  • atomicWAR
    I would have liked to see some actual i5s in the mix. While I get that disabling Hyper-Threading emulates them to a degree, I can't tell you how many posts in the forums I have responded to from i5 owners claiming, or troubleshooting, what ended up being a CPU bottleneck in this game, especially in multiplayer. Folks running good GPUs (everything from RX 480s to GTX 1080s) are getting 100% CPU usage or close to it (again, the worst of it was in multiplayer). Ultimately, I feel like this article indirectly says 4C/4T is enough, when everyday posts in the forums say the opposite. While I know you could never get a fully repeatable benchmark in multiplayer, I would like to see an article on core scaling in multiplayer all the same. It would have to be more about the reviewer's impression of how smooth gameplay is, but doing some benchmarks that capture CPU utilization and frame variance/frame rate would be useful in helping those with i5s (or any 4C/4T CPU) figure out whether their experience is being hindered or is typical compared to other 4C/4T users. This article had a ton of info, but I feel it only scratched the surface of what gamers are dealing with in real life.
  • Kawi6rr
    Wow, a lot of work went into this; very well done! Go AMD, I like how they get better with age, lol. Looks like my next card will be one of the new 580s coming out.
  • thefiend1
    Does increasing the quality settings help the color banding in the gradients of the smoke, fog, etc.? On XB1 the banding is horrible in some cases, and I'm curious if that issue is the same on PC.
  • David_693
    Would have liked to see the AMD 295X2 in the mix as well...
  • MaCk0y
    atomicWAR said: (full comment quoted above)
    I agree. My Core i5-4690K at 4.8GHz is between 95-100% usage in multiplayer.
  • DerekA_C
    I get about 72% usage on my 4790K at 4.4GHz on 64-player multiplayer servers.