AMD Radeon R9 280X, R9 270X, And R7 260X: Old GPUs, New Names

Results: Grid 2

Bigger numbers in Grid 2 mean that even mid-range cards serve up playable performance—so long as you match them up to high-end platforms with plenty of memory bandwidth. In this case, an overclocked Core i7-4960X and four channels of DDR3-1866 memory are what carry the Radeon HD 7870 and R9 270X to almost 50 FPS average rates at 2560x1440.

Tahiti justifies its price premium over the GeForce GTX 760’s GK104 at 2560x1440. The highest-end Nvidia card we’re testing, which sells for $250, barely slides in ahead of the Pitcairn-based boards. Again, Radeon HD 7870 for $180 looks like a pretty sweet deal for as long as it’s around, right?

At the bottom end, R7 260X comes in just ahead of the Radeon HD 7790, which matches its price. The GeForce GTX 650 Ti Boost, selling for $10 extra, does nothing extra for performance at 2560x1440. And its advantage at 1920x1080 isn’t significant enough to change the gaming experience.

Although performance through our Grid 2 benchmark run jumps up and down, creating fairly busy lines, we still see three clumps of cards. Unfortunately for Nvidia, its GeForce GTX 760 is part of the second clump where AMD’s cheaper Pitcairn-based cards show up.

Frame time variance is very low in Grid 2, even when we look at the worst-case 95th percentile numbers.
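For readers curious what a "95th percentile frame time variance" figure represents, here is a minimal illustrative sketch (not Tom's Hardware's actual tooling): one common way to summarize variance is the 95th percentile of the absolute difference between consecutive frame times. Values near zero indicate smooth, consistent pacing; a large 95th-percentile figure flags worst-case stutter.

```python
def frame_time_variance_p95(frame_times_ms):
    """Return the 95th percentile of consecutive frame-time deltas (ms)."""
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    if not deltas:
        return 0.0
    # Nearest-rank percentile: the index that covers 95% of the sorted deltas
    rank = max(0, int(round(0.95 * len(deltas))) - 1)
    return deltas[rank]

# Example: mostly steady ~16.7 ms frames (60 FPS) with one small spike
times = [16.7, 16.8, 16.6, 16.7, 20.1, 16.7, 16.8]
print(round(frame_time_variance_p95(times), 2))  # → 3.4
```

A low result like the cards show here means even the occasional worst frames arrive close to the pace of their neighbors.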

  • CaptainTom
    Wow what's with the AMD hate? As it stands they are doing the same thing Nvidia did except without the outrageous prices. The GTX 770 wasn't a great deal when the 7970 was $50 cheaper. Have fun trying to run BF3 with 2GB of VRAM...
  • slomo4sho
    Nothing revolutionary but better prices I suppose.

    The MSI R9 280X Gaming at $299 appears to outperform the GTX 770 at 1600p and is within margin of error at 1080p, according to TechPowerUp. Not a bad value at $100 less, and it still overclocks well.
  • jimmysmitty
    So, long story short: if you have an HD 7970 GHz Edition, then these do nothing for you.

    Best to hold out till the reviews of the R9 290X, I guess. But considering the specs, I hope for at least a 20% performance increase over the 7970.
  • Shankovich
    What happened to Chris? I didn't see this kind of hate with all of the 700 series rebrands. Also, to the Canadians here, grab the $270 7970 GHz edition cards while you still can.
  • BigMack70
    I don't like this new strategy AMD and Nvidia are taking of rebranding an old series at improved price points and then releasing only one new chip at a stupidly expensive price point.

    Are the days of (nearly) annual simultaneous full line GPU launches from $100-500 with a dual GPU chip to follow at $750-1000 really over?
  • cangelini
    Hate? The R9 280X won an *award*. I think Tahiti at $300 is pretty much brilliant.

    I wrote one of the least flattering GTX 780 stories out there. I only identified a couple of situations where a Titan made any sense at all. And although the 760 *did* change the balance at $250, that card still didn't get an award. I liked the 770 for the simple fact that it delivered better-than-680 performance for close to $100 less.

    The rest of AMD's new line-up is a lot like what exists already. Again, the 7870 is a better value than 270X. So what are you getting worked up over? The fact that I'm pointing out these aren't new GPUs? They're not. ;)
  • Shankovich
    OK Chris, I agree with you, sorry for the overreaction. But I really don't like how nVidia raised prices on some of its rebrands. Looking forward to your 290 and 290X reviews :D
  • ingtar33
    I'll take a 7950 at $129, thank you very much (or two). There is a major retailer selling them for that this week. Best buy all year. Two 7950s for the price of one R9 280X? Yeah... I'll do that all day, every day.
  • tomfreak
    The Radeon 7790 has TrueAudio hardware, but it's not enabled. Boooooo. As a 7790 owner, I'm somewhat disappointed :( . Anyone have any idea if we can CrossFire a 1GB 7790 and a 2GB 260X?
  • net_nakul
    By the time an R9 380X comes out, the GCN Tahiti XT architecture may be four years old (assuming end of 2015). AMD had better come up with an awesome new architecture by then, considering the R&D time it has.

    That goes for you too, Mr. NVIDIA.