Results: Arma III
I wanted to cut down on the page count of this story, so all of the re-run benchmarks are piled into one chart covering three resolutions. Again, everything you see in the next seven pages is the product of heating every graphics card up prior to testing.
Right out of the gate, Radeon R9 290 jumps up alongside our press-sampled R9 290X at 1920x1080, 2560x1440, and the unplayable 3840x2160. Of course, achieving this requires a more aggressive 47% fan speed ceiling, which isn’t as bad as the 290X’s Uber mode, but is still significantly louder than Quiet mode.
Meanwhile, the R9 290X we bought off the shelf finishes below the $330 GeForce GTX 770 and the $400 R9 290. Now you see why we’re making such a big deal about the variance between boards, right?
Fortunately for AMD, the shift to 2560x1440, where we’d expect these products to be used, shakes up the standings. The press-sampled R9 290 finishes in front of the GeForce GTX 780, and indeed the Titan as well. It continues to barely trail the 290X card we received from AMD, too. But then there’s the retail 290X, which manages to tie the $500 GeForce GTX 780, but loses to the 290 it should be beating.
By the time we hit 3840x2160, all of these cards are running too slowly for playable performance. You’d need to back Arma III off from its Ultra graphics quality setting—and after spending $3500 on a monitor, you aren’t going to want to do that.
The frame rate over time charts demonstrate just how close Radeon R9 290 and 290X come to each other—at least the cards we were sent by AMD. Our retail board is consistently in a different (lower) class.
Nvidia’s cards have an issue with Arma at 1920x1080—we cannot run their results through FCAT without a ton of extra frames getting inserted into the video output. Charted out, these insertions are what skew worst-case frame time variance at that resolution.
At 2560x1440, every card drops back to very low variance, which is what we want to see to confirm that there’s little in the way of stuttering going on.
Stepping up to Ultra HD, however, frame rates drop so low, and the workload is so demanding, that variance between frames grows substantially.