
For each game we’re testing, we need to evaluate three different products. First up is AMD’s “new” R9 280X. As expected, it’s slower than the Radeon HD 7970 GHz Edition, though only slightly. Nvidia’s closest-priced alternative, the GeForce GTX 760, sells for $50 less, but it gets beaten in Arma III even by the R9 270X. The cheapest 7970 GHz Edition card (as of this writing) sells for $330, so at $30 less, the R9 280X is a good example of AMD’s Tahiti GPU made more attractive.
Stepping down one product category means giving up playable performance at 2560x1440 (at least using Very High quality settings). Nevertheless, AMD’s R9 270X has little trouble outpacing the GeForce GTX 760 and the Radeon HD 7870, so against Nvidia’s card, AMD scores a value win without question. But with 7870s going for as little as $170, spending $30 more on an R9 270X is a step in the wrong direction, price-wise, for the same Pitcairn (now Curacao) GPU.
Arma is a great-looking title, and its Very High detail setting is pretty taxing. An average frame rate in the 30s might not be satisfactory at 1920x1080, compelling you to scale back on eye candy (a shame, really). The R7 260X won’t change your experience compared to the Radeon HD 7790. The thing is, most 7790s are 1 GB cards, and the 2 GB Gigabyte model we bought sells for the same $140 AMD plans to charge for its R7 260X. So, for the same price, you’re getting a slight overclock and TrueAudio support. The good news for AMD is that, even after a price cut on Nvidia’s GeForce GTX 650 Ti Boost, its own Bonaire-based boards are still the better value. Our Best Graphics Cards For The Money column concurs.


An analysis of frame rate over time at 1920x1080 and 2560x1440 breaks our 10 comparison boards into three distinct groups. Up top, the Tahiti-based offerings appear uncontested by the GeForce GTX 760, which instead competes against $200 Pitcairn-based cards.
Arma is taxing enough that, at 2560x1440, you’re probably going to want a Tahiti-class card. Otherwise, you’re going to spend a fair amount of time under 30 FPS.


In single-GPU configurations, all of these solutions demonstrate low frame time variance. For more on what this measurement includes and how we generate it, check out this page.
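To give a rough sense of what a frame time variance metric can look like (this is a generic sketch for illustration, not necessarily the exact formula used for these charts), one common approach is to average the absolute difference between consecutive frame times from a captured trace:

```python
def frame_time_variance(frame_times_ms):
    """Average absolute change between consecutive frame times, in ms.

    A simple smoothness metric: low values mean consecutive frames take
    similar amounts of time, i.e. consistent pacing; spikes (stutter)
    drive the number up even when the average frame rate looks fine.
    """
    if len(frame_times_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

# A perfectly steady ~60 FPS trace vs. one with a single stutter spike
steady = [16.7] * 10
stutter = [16.7, 16.7, 50.0, 16.7, 16.7]
print(frame_time_variance(steady))   # steady pacing scores 0.0
print(frame_time_variance(stutter))  # the 50 ms spike inflates the score
```

Both traces average out to similar frame rates, which is exactly why a variance-style metric catches stutter that an FPS average hides.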
- Tahiti, Pitcairn, And Bonaire Show Up For An Encore
- R9 280X: The Tahiti GPU’s Second (Or Third?) Lease On Life
- R9 270X: Pitcairn Gets A Little Boost
- R7 260X: TrueAudio’s First Outing On The Back Of Bonaire
- TrueAudio: Dedicated Resources For Sound Processing
- Display Technology
- Test Setup And Software
- Results: Arma III
- Results: Battlefield 3
- Results: BioShock Infinite
- Results: Crysis 3
- Results: Grid 2
- Results: The Elder Scrolls V: Skyrim
- Results: Tomb Raider
- CAD: AutoCAD 2013 And Inventor 2013
- OpenGL: Maya 2013 And LightWave
- OpenCL: Bitmining, OpenCL, And RatGPU
- Power Consumption
- Clock Rate And Temperature
- Fan Speed And Noise
- Old GPUs Ride Again, But That’s Not A Bad Thing
I wrote one of the least flattering GTX 780 stories out there. I only identified a couple of situations where a Titan made any sense at all. And although the 760 *did* change the balance at $250, that card still didn't get an award. I liked the 770 for the simple fact that it delivered better-than-680 performance for close to $100 less.
The rest of AMD's new line-up is a lot like what exists already. Again, the 7870 is a better value than 270X. So what are you getting worked up over? The fact that I'm pointing out these aren't new GPUs? They're not.
Best to hold out till the reviews on the R9-290X I guess. But considering the specs I hope for at least 20% performance increases over a 7970.
The MSI R9 280X Gaming at $299 appears to outperform the GTX 770 at 1600p, and it's within the margin of error at 1080p according to TechPowerUp. Not a bad value at $100 less, and it still overclocks well:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/26.html
Are the days of (nearly) annual simultaneous full line GPU launches from $100-500 with a dual GPU chip to follow at $750-1000 really over?
That goes for you too, Mr. Nvidia.
You won't want to. The 260X is more expensive, and you'll only get 1 GB of its memory in CrossFire with a 7790. (In CrossFire/SLI, the video memory is duplicated on both cards, not shared, so the total memory of the CrossFire/SLI setup is equal to the smallest amount on any one card. A 2 GB + 1 GB pairing in CrossFire will have basically 1 GB of VRAM for the whole setup.)
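The point about mirrored memory boils down to one line: since each GPU keeps its own full copy of the frame data, the usable pool is capped by the smallest card. A quick illustrative sketch (card pairings taken from the discussion above):

```python
def effective_vram_gb(cards):
    """Usable VRAM for a CrossFire/SLI setup, in GB.

    Textures and buffers are duplicated on every GPU rather than
    pooled, so the setup behaves like the card with the least memory.
    `cards` is a list of (name, vram_gb) pairs.
    """
    return min(vram for _, vram in cards)

# A 2 GB R7 260X paired with a 1 GB HD 7790 behaves like a 1 GB setup
print(effective_vram_gb([("R7 260X", 2), ("HD 7790", 1)]))  # 1
```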
The TrueAudio thing is still a mystery. We have to see if it really takes off or not. If it has even a small success like PhysX, I guess I won't mind shelling out an extra $10-20 for it.
The 7970/R9 280X is not competing in the 770's price bracket anymore. The 770 is $400 minimum; until that price comes down, reviewing it against the 7970 would make as much sense as reviewing a 7950 against a GTX 650.
What retailer is doing this deal? I've been holding out to upgrade my 5850 for a while now and a pair of these would be a nice little (gigantic) upgrade
It feels like the price per pixel (in games at a given setting) has stayed the same for a while, despite the increase in average display resolutions. That would mean gaming gets more and more expensive if you like to max out the settings. I don't know if this is AMD/Nvidia's fault or the game developers' fault, or both, but it's kind of annoying.
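For what it's worth, the "price per pixel" idea is easy to put rough numbers on. A back-of-the-envelope sketch, using the card prices mentioned in this article:

```python
def dollars_per_megapixel(price_usd, width, height):
    """Card price divided by the number of megapixels it has to drive."""
    return price_usd / (width * height / 1e6)

# A $300 R9 280X driving 2560x1440 vs. a $200 R9 270X at 1920x1080
print(round(dollars_per_megapixel(300, 2560, 1440), 2))
print(round(dollars_per_megapixel(200, 1920, 1080), 2))
```

By this crude measure, the bigger card driving the bigger screen actually costs slightly less per pixel, though the absolute outlay (and the settings you can sustain) is what most buyers feel.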