Any time we discuss realism in first-person shooters, the Arma series comes up. After a lengthy alpha and beta period, Arma 3 finally went live earlier this month.
If you want to play this one at its highest settings using a 4K display’s native 3840x2160 resolution, you’re probably going to want two GeForce GTX Titans. We can imagine that three GeForce GTX 770s or 780s would work as well, though two of either card tend to fall under the average frame rates we want to see.
Indeed, the 770s spend some time under 30 FPS in our simple run-through sequence, while the 780s flirt with the 35 FPS mark at a number of points. It takes a couple of Titans to stay above 40 FPS for most of the benchmark.
Our frame time variance calculation gives us the difference between the time it takes to display a frame and the average of the 20 frames before and after it. This deliberately minimizes the impact of variance caused by the frame rate rising or falling with game load (which is natural), and instead tries to identify problem areas.
To give you an idea of how important this calculation is, average variance across our Arma 3 run is 46 ms on a GeForce GTX Titan. But exclusively comparing each frame to the 20 frames before and after it drops that number to 0.65 ms.
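To make the metric concrete, here is a minimal sketch of how such a windowed frame-time variance could be computed. This is our interpretation of the description above, not the exact tool used for these results; the handling of frames near the start and end of the run (where the window is clamped) is an assumption.

```python
def frame_time_variance(frame_times, window=20):
    """Average absolute difference (in ms) between each frame's render time
    and the mean of the frames within `window` positions before and after it.

    Comparing against a local average, rather than the run-wide average,
    filters out slow frame-rate swings caused by changing game load and
    highlights frame-to-frame inconsistency instead.
    """
    deltas = []
    for i, t in enumerate(frame_times):
        # Clamp the window at the edges of the run (an assumption).
        lo = max(0, i - window)
        hi = min(len(frame_times), i + window + 1)
        neighbors = [x for j, x in enumerate(frame_times[lo:hi], start=lo) if j != i]
        local_mean = sum(neighbors) / len(neighbors)
        deltas.append(abs(t - local_mean))
    return sum(deltas) / len(deltas)
```

A perfectly even run (every frame taking the same time) scores zero, while a single stuttered frame in an otherwise smooth run raises the score, which is exactly the behavior the windowed comparison is meant to capture.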
It makes sense that we would see the lowest frame time variance (and hence the most consistent frame delivery) from a single-GPU configuration, and the GeForce GTX Titan does show up at the top of our chart. The dual-GPU setups appear in the order of their performance; slower cards would be expected to deliver frames less consistently, even between successive frames.
- What Does It Take To Game At 3840x2160?
- How Do We Benchmark Graphics At 4K Resolutions?
- Results: Arma 3
- Results: Battlefield 3
- Results: BioShock Infinite
- Results: Crysis 3
- Results: Grid 2
- Results: The Elder Scrolls V: Skyrim
- Results: Tomb Raider
- 4K Gaming Is Here And Possible, But Are You Willing To Pay For It?