Results: The Elder Scrolls V: Skyrim
Skyrim tends to be more platform-bound than most of our other benchmarks, so an overclocked Ivy Bridge-E-based configuration with lots of fast memory lets these cards perform to their peak potential using the Ultra detail preset.
The thing is, this game just doesn’t tax graphics hardware very much. You’ll still find it playable at 2560x1440, even on a Bonaire-powered Radeon HD 7790 or R7 260X. Most notable, perhaps, is that a $200 R9 270X trades blows with a $250 GeForce GTX 760.
Smooth frame-rate-over-time line graphs demonstrate playable performance across the entire field at 1920x1080, and mostly ample numbers at 2560x1440 using the game’s Ultra quality preset.
The GeForce cards experience higher frame time variance, on average. At 1920x1080, only the 650 Ti’s worst-case result is something you’d likely notice. At 2560x1440, however, the numbers using Nvidia’s latest beta drivers aren’t as good. Again, it’s the GeForce GTX 650 Ti that demonstrates the least-favorable behavior.
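The frame time variance discussion above works from per-frame render times logged during a benchmark run. As a rough illustration of how such metrics can be derived, here is a minimal Python sketch; the function name and the sample data are hypothetical, not output from our benchmark logs:

```python
# Sketch: derive frame-time metrics like those discussed above.
# frame_time_metrics and the sample data below are illustrative
# assumptions, not this article's actual benchmark pipeline.

def frame_time_metrics(frame_times_ms):
    """Return average FPS, 95th-percentile frame time, worst-case
    frame time, and average frame-to-frame variance."""
    n = len(frame_times_ms)
    avg_ms = sum(frame_times_ms) / n
    ordered = sorted(frame_times_ms)
    p95_ms = ordered[min(n - 1, round(0.95 * (n - 1)))]
    worst_ms = ordered[-1]
    # Frame time variance: average absolute difference between
    # consecutive frames, which captures stutter that a plain
    # FPS average hides.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return {
        "avg_fps": 1000.0 / avg_ms,
        "p95_ms": p95_ms,
        "worst_ms": worst_ms,
        "avg_variance_ms": sum(deltas) / len(deltas),
    }

# Hypothetical run: mostly ~16.7 ms frames (60 FPS) with a few spikes.
sample = [16.7] * 95 + [25.0, 30.0, 33.3, 40.0, 50.0]
m = frame_time_metrics(sample)
```

Two cards with identical average frame rates can produce very different `avg_variance_ms` and `worst_ms` values, which is why the worst-case results called out above matter even when the line graphs look similar.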
The MSI R9 280X Gaming at $299 appears to outperform the GTX 770 at 1600p, and is within the margin of error at 1080p, according to TechPowerUp. Not a bad value at $100 less, and it still overclocks well.
Best to hold out until the reviews of the R9 290X, I guess. But considering the specs, I hope for at least a 20% performance increase over a 7970.
Are the days of (nearly) annual, simultaneous full-line GPU launches from $100-500, with a dual-GPU chip to follow at $750-1000, really over?
I wrote one of the least flattering GTX 780 stories out there. I only identified a couple of situations where a Titan made any sense at all. And although the 760 *did* change the balance at $250, that card still didn't get an award. I liked the 770 for the simple fact that it delivered better-than-680 performance for close to $100 less.
The rest of AMD's new line-up is a lot like what exists already. Again, the 7870 is a better value than 270X. So what are you getting worked up over? The fact that I'm pointing out these aren't new GPUs? They're not. ;)
That goes for you too, Mr. NVIDIA.