At 1920x1080, the four fastest AMD graphics cards were separated by less than 10 FPS. But 2560x1440 is taxing enough to spread the field.
Radeon RX Vega 64 is about 14% faster than R9 Fury X, which is almost 30% faster than R9 290X. In turn, the Hawaii-based card is more than 31% faster than Tahiti. AMD’s first GCN-based board establishes an astounding 68% lead over Cayman. That kind of generational scaling is practically unheard of.
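Because each of those percentages is relative to the next card down, the gains compound multiplicatively. A quick sketch of the arithmetic, using the per-generation multipliers quoted above (the card labels are for readability only):

```python
# Per-generation speedups cited in the text, expressed as multipliers
# (e.g. "14% faster" -> 1.14).
gains = [
    ("RX Vega 64 over R9 Fury X", 1.14),
    ("R9 Fury X over R9 290X", 1.30),
    ("R9 290X (Hawaii) over HD 7970 (Tahiti)", 1.31),
    ("HD 7970 over HD 6970 (Cayman)", 1.68),
]

cumulative = 1.0
for step, multiplier in gains:
    cumulative *= multiplier  # chained percentages multiply, not add
    print(f"{step}: {multiplier:.2f}x (cumulative {cumulative:.2f}x)")
```

Chaining the quoted figures puts Vega 64 at roughly 3.3x Cayman, which is why simply adding the percentages would badly understate the gap.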
Similarly unbelievable is the fact that, 10 years after Crysis’ release, AMD’s fastest graphics card (with 18.7x as many transistors as the RV670 XT available in 2007) cannot break an average frame rate of 100 FPS at 2560x1440. It's about 11x faster than Radeon HD 3870, if that's any consolation.
A higher resolution has the same effect on Nvidia’s line-up. GeForce GTX 1080 Ti approaches a 20% lead over 980 Ti, which is 34% faster than 780 Ti. That card’s almost-47%-higher average than GTX 680 gets us back to 2012—and perhaps the slowest card you’d want to use for playable frame rates using Crysis’ Very High quality preset.
Now that we’re beyond platform/software bottlenecks, let’s see what anti-aliasing does to performance at 2560x1440.
8x Anti-Aliasing Results
It takes a Radeon R9 290X to average ~60 FPS in Crysis at 2560x1440 with 8xAA. And that’s an ~11% hit compared to the frame rates we recorded without anti-aliasing.
But the most interesting event, by far, is Radeon HD 7970’s victory over 6970. That shift from TeraScale 3 to GCN had major implications for AMD. GeForce GTX 680 might have absorbed some of the Tahiti GPU’s limelight at the time. But in retrospect, it was an architectural milestone.
Nvidia’s GeForce GTX 1080 Ti maintains an average in excess of 110 FPS with 8xAA applied, hardly skipping a beat compared to its result without anti-aliasing.
Lower-end cards take a more pronounced hit: GeForce GTX 980 Ti achieves roughly 83% of its AA-free performance, while the 780 Ti manages ~77% of its average frame rate compared to 2560x1440 with no anti-aliasing. The loss is even worse for GeForce GTX 680, which slows down so much that you wouldn’t even want to run at these settings.
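Those retained-performance figures fall out of a simple ratio between each card’s 8xAA average and its no-AA average at the same resolution. A minimal sketch; the 100 and 83 FPS inputs below are hypothetical round numbers chosen to match the ~83% figure cited for the 980 Ti, not measured results:

```python
def aa_retention(fps_no_aa: float, fps_8xaa: float) -> float:
    """Fraction of anti-aliasing-free performance retained with 8xAA enabled."""
    return fps_8xaa / fps_no_aa

# Hypothetical example: a card dropping from 100 FPS to 83 FPS
# retains 83% of its AA-free performance.
print(f"{aa_retention(100.0, 83.0):.0%}")
```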