Benchmarking Battlefield 1 in DirectX 12
Battlefield 1 launched in October of 2016, so yeah, you could say our analysis of the game’s performance is fashionably late. But we’re making up for it by including copious amounts of data. How thorough did we get?
Well, we tested 16 different graphics cards across two resolutions on a mainstream gaming platform using all four of Battlefield 1’s quality presets, then we tested 19 more cards on a high-end gaming platform at three resolutions, again using all four of Battlefield 1’s quality settings. To wrap up, we benchmarked five host processing configurations with two, four, six, eight, and 10 cores at three resolutions to compare CPU scaling. All told, we have 267 runs of the same sequence charted in various ways.
Battlefield 1: A Brief Recap
At this point, Battlefield 1 is a mature AAA title. DICE even released the game’s first major expansion, They Shall Not Pass, last month. Believe it or not, BF1 is considered the 15th installment in a franchise dating back to 2002’s Battlefield 1942. All but three were available for the PC, and excluding that trio, we’ve seen Battlefield evolve across five versions of the Refractor and Frostbite game engines. Frostbite 3.0, upon which Battlefield 1 was built, previously powered Battlefield 4 and Battlefield Hardline. Naturally, then, DirectX 12 is supported, and that’s where we focus our testing efforts.
EA’s minimum and recommended system requirements are fairly stout for a game with such mass appeal. In fact, they look a lot like the requirements for Mass Effect: Andromeda, also built on the Frostbite 3 engine.
Unfortunately, EA’s stance on multi-GPU support is that those configurations are not officially supported in Battlefield 1. This has become a point of contention as of late, since two big game updates apparently caused problems for owners of CrossFire- and SLI-equipped rigs in DX11 (neither technology works in any way under DX12). As of this writing, you’re best off with a fast single-GPU setup in this game, regardless of the API you choose to use (though AMD did just release its Radeon Software Crimson ReLive Edition 17.4.2, which supposedly addresses CrossFire scaling in Battlefield).
The first screen you’re presented with upon clicking More -> Options -> Video includes settings for screen mode, device, resolution, brightness, vertical sync, field of view, motion blur, weapon depth of field, and a colorblind mode. As far as our testing goes, we toggle between 1920x1080, 2560x1440, and 3840x2160, and leave v-sync disabled. Everything else remains default.
All of the quality-oriented options live over on the Advanced tab. There, you can toggle DirectX 12 support on or off, alter the resolution scale, UI scale factor, and maximum frame rate. There’s a GPU Memory Restriction setting to keep the game from using more RAM than your graphics card actually has and a Graphics Quality preset selection field where individual quality settings are specified. The five options are Low, Medium, High, Ultra, and Custom.
Low turns Texture Quality, Texture Filtering, Lighting Quality, Effects Quality, Post Process Quality, Mesh Quality, Terrain Quality, and Undergrowth Quality to Low, and disables Antialiasing Post and Ambient Occlusion.
The Medium preset dials all of those options up one notch, also setting Antialiasing Post to TAA (temporal anti-aliasing) and Ambient Occlusion to HBAO (horizon-based ambient occlusion). TAA goes a long way in eliminating the nasty shimmer artifacts that affect objects like barbed wire, so the feature is recommended whenever your horsepower budget allows for it.
High bumps each setting up once more, maintaining TAA and HBAO.
Ultra does the same, and again leaves TAA and HBAO active.
How We Test Battlefield 1
This performance exploration is much more in-depth than the game coverage we typically try to publish days after a new title launches. It involves 29 unique graphics cards and two distinct platforms.
For our mainstream platform, we wanted to get as close to Battlefield 1’s minimum requirements as possible. Originally we had an FX-4350 installed, but swapped it out in favor of the lower-frequency FX-8320 (apparently our old six-core chips are no longer where they’re supposed to be). Eight gigabytes of DDR3-1333 from G.Skill on MSI’s 990FXA-GD80 motherboard is right in line with EA’s lowest spec. Moreover, Windows 10 Pro is necessary for testing under DirectX 12.
The higher-end platform needed to be powerful, but not budget-breakingly so. A Core i7-6700K on MSI’s Z170A Gaming M7 with 16GB of G.Skill DDR4-2133 is plenty fast to illustrate any differences between mid-range and enthusiast-oriented graphics hardware. Ryzen wasn't ready yet when testing commenced, so we miss out on AMD's latest and greatest. As you'll see shortly, though, the quality presets gamers really want to use are predominantly GPU-bound anyway.
Then, of course, there are the graphics cards. Notably missing is GeForce GTX 1080 Ti, which also wasn't out when our data was collected. Titan X (Pascal) comes close enough to that board's performance, though.
| AMD | Nvidia |
| --- | --- |
| **1st-Gen GCN:** Radeon R9 270 2GB, Radeon R9 280X 3GB | **Kepler:** GeForce GTX 760 2GB, GeForce GTX 770 2GB, GeForce GTX 780 3GB, GeForce GTX 780 Ti 3GB, GeForce GTX Titan 6GB |
| **2nd-Gen GCN:** Radeon HD 7790 2GB, Radeon R9 290 4GB, Radeon R9 290X 4GB, Radeon R9 390 8GB, Radeon R9 390X 8GB | **Maxwell:** GeForce GTX 950 2GB, GeForce GTX 960 2GB, GeForce GTX 970 4GB, GeForce GTX 980 4GB, GeForce GTX 980 Ti 6GB, GeForce GTX Titan X 12GB |
| **3rd-Gen GCN:** Radeon R9 380 4GB, Radeon R9 Fury 4GB, Radeon R9 Fury X 4GB | **Pascal:** GeForce GTX 1050 Ti 4GB, GeForce GTX 1060 6GB, GeForce GTX 1070 8GB, GeForce GTX 1080 8GB, Titan X 12GB |
| **4th-Gen GCN:** Radeon RX 460 4GB, Radeon RX 470 4GB, Radeon RX 480 8GB | |
There is no built-in benchmark, so we had to find a sequence that could be reproduced hundreds of times without much risk of death. The opening sequence from Episode 4, O La Vittoria, gives us 80 seconds between the second artillery piece firing and reaching the barbed wire fence to collect data. Performance is captured using the tools detailed in PresentMon: Performance In DirectX, OpenGL, And Vulkan. Check out the complete sequence below:
Bear in mind that this is but one slice of action from a long and varied single-player campaign. Moreover, the multi-player experience is much more frenetic, and based on what we’ve seen from Battlefield games in the past, we know it makes thorough use of fast multi-core CPUs.
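PresentMon logs one row per presented frame, with the time since the previous present recorded in its standard msBetweenPresents column. As a rough illustration of how numbers like the averages and frame-time percentiles in our charts can be derived from such a capture, here is a minimal sketch (the summarize helper is hypothetical, not part of PresentMon or our test harness; real captures contain many more columns, which csv.DictReader simply ignores):

```python
import csv
import statistics

def summarize(path):
    """Summarize a PresentMon CSV capture.

    Assumes the file has a header row containing the standard
    msBetweenPresents column (milliseconds between presents).
    """
    with open(path, newline="") as f:
        frames = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]

    avg_ms = statistics.mean(frames)          # average frame time in ms
    frames_sorted = sorted(frames)
    # 99th-percentile frame time: the slow frames that show up as stutter
    p99_ms = frames_sorted[int(0.99 * (len(frames_sorted) - 1))]

    return {
        "frames": len(frames),
        "avg_fps": 1000.0 / avg_ms,           # ms per frame -> frames per second
        "p99_ms": p99_ms,
    }
```

A capture of our 80-second O La Vittoria run would simply be passed in by path; the 99th-percentile frame time is often more telling than average FPS, since a handful of long frames is what players perceive as hitching.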
Wow. The amount of work in putting this together. Thanks, from all the BF1 gamers out there. You knocked my socks off, and are pushing me to upgrade my GPU.
Nice article. Would have been interesting to see the 1080 Ti and the Ryzen 1800X mixed in there somewhere. I have a 7700K and a 980 Ti, so it would be good info to get some direction on where to take my hardware next. I'm sure other people might find that interesting too.
Good job, just remember that these "GPU showdowns" don't tell the whole story because cards are running at stock, and there are GPUs that can get huge overclocks, thus performing significantly better.
Case in point: GTX 780 Ti.
The 780 Ti featured here runs at stock, which was an 875 MHz base clock and 928 MHz boost clock, whereas third-party cards ran at 1150 MHz and boosted up to 1250-1300 MHz. We are talking about 30-35% more performance for this card, which you aren't seeing here at all.
Great write-up, just a shame you didn't use any i5 CPUs. I would have really liked to see how an i5-6600K with its four cores competes against the Hyper-Threaded i7s.
Wow, impressive results from AMD here. You can really see that Radeon FineWine™ tech in action.
And then you run in DX11 mode and it runs faster than DX12 across the board. Thanks for the effort you put into this, but it's rather pointless since DX12 has been nothing but a pile of crap.
Why does my 680 OC 2GB SLI setup run this at 100Hz at 3440x1440? 2133 G.Skill, 4770K @ 4.2GHz.
@XIZEL My i5-6600K @ 4.6GHz and RX 480 get 80-90 fps at max settings on all 32v32 multiplayer maps, with very few spikes either up or down.
780 Ti below an R9 290 three years down the road...
Fascinating stuff! Love that you are still including the older models in your benchmarks; it makes for great info for a budget gamer like myself! In fact, this may help me determine what goes in the budget build I'm working on right now. I was going to use dual 290Xs (preferably 8GB if I can find them), but now I might go with something else.