How We Tested
How big a sensation is Fortnite at the moment? It's hooking NBA players. It's setting records for concurrent player counts. It's making our colleagues lose more sleep than they'd likely care to admit. (Wake up, buddy.)
Understatement of the year: Fortnite has enjoyed a whole lot of success since Epic Games added a free-to-play Battle Royale mode last September, building on the already available Save the World mode (PvE). (For more on Fortnite: Battle Royale, see this primer on sister site Tom's Guide.)
The game is available on the Sony PlayStation 4, Microsoft Xbox One, Windows, and macOS platforms. It leverages the Unreal Engine 4, and it is DirectX 11-compatible (Metal, if you're on macOS). And indeed, on the PC side of things, it's designed to run even on low-end hardware. Sister site Laptop did a keen analysis of how to play Fortnite on the integrated graphics chips that are common fare in most non-gaming-minded laptops and low-end desktop PCs.
Check out that story if you intend to play Fortnite on that class of hardware; under those circumstances, playing well becomes a balancing act of tweaking settings and resolutions. Our intention here, instead, is to see how Fortnite plays on a mainstream PC with a variety of dedicated graphics solutions, and what it takes to run the game at its maximum settings preset, which in Fortnite is known as "Epic."
The Benchmark Sequence
The trick to testing Fortnite on any kind of hardware is settling on a test sequence that will generate meaningful, repeatable results. It is always tough to choose a sequence for games that don't include an integrated benchmarking routine. For the purposes of our performance analysis, this time around we created an easily reproducible walk-through within the mission “Before and After Science,” found in Save the World mode. A recording of the exact sequence is shown below...
Minimum & Recommended System Requirements
Fortnite's minimum and recommended system configurations are available directly from Epic's webpage. Whether you're talking about host processing, system memory, or graphics horsepower, Fortnite doesn't seem particularly demanding. Of course, it remains to be seen if our mainstream gaming PC is enough to enable the Epic quality preset at 1920x1080...
| Component | Minimum | Recommended |
|---|---|---|
| Processor | Intel Core i3 (2.4GHz) | Intel Core i5 (2.8GHz) |
| Graphics | Intel HD Graphics 4000 | Nvidia GeForce GTX 660 or AMD Radeon HD 7870 |
| Operating System | Windows 7, 8.1, or 10 (64-bit) | Windows 7, 8.1, or 10 (64-bit) |
| Test Configuration | |
|---|---|
| Operating System | Windows 10 x64 Pro 1709 (16299.248) |
| Graphics Drivers | The latest public drivers available at the time we ran our benchmarks: Nvidia GeForce Game Ready 391.24, AMD Radeon Adrenalin Edition 18.3.3 |
| Game Version | The most up-to-date version of the game at the time we ran our benchmarks: Fortnite v3.3 (3948073) |
We recently updated our test configuration to better reflect the state of mid-range PC gaming on a desktop in 2018. This time around, we picked an AMD Ryzen-based platform, focusing specifically on the Ryzen 5 1600X as a great CPU compromise option for enthusiasts looking to save some money.
Steam's survey of hardware and software configurations offers us a view of the most prevalent components and settings (the data comes from February 2018):
- 8GB of RAM is found in 45% of surveyed gaming PCs. Our system has 16GB, as do almost 40% of surveyed gamers.
- Full HD (1920x1080) is used by 76% of gamers, while 8% are still running at 1366x768. QHD (2560x1440) is the resolution of choice for only 3.4% of respondents, and 4K adoption remains too small to be significant. We test at 1440p in addition to the classic Full HD.
- Quad-core CPUs are installed in 72% of surveyed systems. In anticipation of CPU trends in the coming months, we're using a mid-range six-core processor.
Graphics Card Selection
We chose 10 representative graphics cards for this test, mainly entry-level and mainstream options. Here are the competing boards:
All performance data is collected using the PresentMon tool and our own custom front end.
To represent performance accurately, each graphics card is warmed up to a stable temperature before measurements are collected. Most newer GPUs employ mechanisms to optimize clock rates based on variables such as power consumption and temperature. As a result, tests run during the warm-up period would convey better performance than you'd see in the real world. We therefore execute the benchmark sequence one time to warm up the card, prior to gathering official data. For graphics settings, we tested the game at Full HD and QHD resolutions, with the maximum settings (Epic preset) and with the Show Grass setting turned on.
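PresentMon logs per-frame timing data to a CSV file, which a front end then reduces to the familiar summary metrics. As a rough illustration (not Tom's Hardware's actual front end), here is a minimal Python sketch that reads a PresentMon capture and computes average FPS and the 99th-percentile frame time; the `MsBetweenPresents` column is PresentMon's standard frame-time field, while the file name is hypothetical:

```python
import csv
import statistics

def summarize(csv_path):
    """Reduce a PresentMon CSV capture to average FPS and 99th-percentile frame time."""
    frame_times = []  # milliseconds between consecutive presents, one entry per frame
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times.append(float(row["MsBetweenPresents"]))
    avg_ms = statistics.mean(frame_times)
    # 99th-percentile frame time: 99% of frames complete at least this quickly
    p99_ms = statistics.quantiles(frame_times, n=100)[98]
    return {"avg_fps": 1000.0 / avg_ms, "p99_ms": p99_ms}

# Hypothetical capture file from a benchmark run:
# summarize("fortnite_epic_1080p.csv")
```

Percentile frame times matter here because a card warmed up past its boost-clock honeymoon can post a healthy average FPS while still stuttering; the 99th percentile exposes those worst frames.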