AMD Radeon R9 380X Nitro Launch Review

FHD (1920x1080) Gaming Results

AMD positions the Radeon R9 380X as a QHD/1440p graphics card. However, if we’re honest, most buyers will probably use it to drive the still-popular FHD resolution and happily max out their detail settings.

The Witcher 3: Wild Hunt

The difference between AMD's Radeon R9 380 and 380X is only about seven percent, which really isn’t that much. Both graphics cards average close to 40 FPS, and it's really hard to tell them apart subjectively in the real world. Both cards would, however, benefit from slightly lower detail settings.

Grand Theft Auto V

New game, same results? Almost, but not quite; the new graphics card’s lead shrinks to five percent. Both cards manage playable frame rates, even though most enthusiasts would prefer significantly better performance; to get it, you'd have to dial back graphics quality.

Metro: Last Light

This title is also a classic hardware benchmark, since it heavily features tessellation. AMD’s latest pulls ahead, managing a more comfortable nine percent lead. It’s interesting to see that the GeForce GTX 960, the weakest card on paper, manages to slot in between AMD’s two contenders.

Bioshock Infinite

Now what? AMD’s Radeon R9 380X beats its smaller sibling by six percent. If you think that's subtle, just wait until you actually play the games we're testing. You won’t notice the difference at all.

Tomb Raider

Tomb Raider is one of AMD’s flagship titles. Both of the company's primary contenders fare well in it. For the first time, the Radeon R9 380X manages a double-digit lead over the X-less 380 (a full 10 percent). However, this difference is still barely noticeable when you sit down to play. Elsewhere, AMD's Radeon R9 390 plays in a league of its own, and Nvidia’s GeForce GTX 960 is left in the other cards’ dust.

Battlefield 4 (Campaign)

Battlefield 4 has seen many patches, and the drivers should be perfectly optimized for it. Consequently, this game is still worth a look. Nvidia’s GeForce GTX 960 manages to beat AMD’s Radeon R9 380 and is, in turn, beaten by the 380X. Again, though, a six percent difference is of no consequence in a subjective gaming comparison.

Middle-Earth: Shadow of Mordor

We finally get a look at what happens when Tonga faces a real challenge and the driver does its part. AMD’s newest card pulls ahead of its stablemate by 16 percent, yielding our first impressive result. Nvidia’s GeForce GTX 960 can’t keep up with either of its main competitors.

Thief

Things get more challenging again, with power consumption going up to the level we saw running Metro: Last Light. When the dust settles, AMD’s Radeon R9 380X comes out ahead by eight percent. Once again, this is barely noticeable in a real-world situation. The frame-time curves are very similar, after all.

Ashes of the Singularity

Since there are really no mature, or even finished, DirectX 12 games on the market, we had to go with the pre-beta build of Ashes of the Singularity. Consequently, consider the results subject to change; there's just not enough optimization in place yet. At least we'll get some idea of where performance may stand in the future. Ark: Survival Evolved would have been nice to test, but because its DirectX 12 patch kept getting pushed back, we had to skip it.

The render times of individual frames across the different test views are interesting. The total rendering time is consistent with how demanding each benchmark scene is.

We programmed our own interpreter that automatically analyzes the log files, giving us the number of CPU calls and the ratio of frames that were actually rendered.
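Conceptually, the parser is simple. As a rough illustration, a minimal version could look something like the Python sketch below; note that the file name and column names (cpu_calls, rendered) are placeholders for this example, not the benchmark's actual log format.

```python
import csv

def summarize_log(log_path):
    """Tally CPU calls and the share of logged frames that were actually rendered.

    The column names used here (cpu_calls, rendered) are illustrative
    placeholders; the real benchmark log uses its own format.
    """
    cpu_calls = 0
    frames_total = 0
    frames_rendered = 0

    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            frames_total += 1
            cpu_calls += int(row["cpu_calls"])   # calls issued by the CPU for this frame
            if int(row["rendered"]):             # 1 if the frame actually reached the screen
                frames_rendered += 1

    ratio = frames_rendered / frames_total if frames_total else 0.0
    return cpu_calls, ratio

if __name__ == "__main__":
    calls, ratio = summarize_log("ashes_benchmark_log.csv")
    print(f"CPU calls: {calls}")
    print(f"Rendered frames: {ratio:.1%} of all logged frames")
```

Running a pass like this over each card's log makes the CPU-call counts and rendered-frame ratios directly comparable between runs.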

Bottom Line

With few exceptions, the R9 380X Nitro provides a good gaming experience at FHD using the highest settings. Enthusiasts who prefer high frame rates at or above their monitor’s native refresh rate need to dial down the quality settings, though.

Really, there's not much difference between AMD’s Radeon R9 380X and the older X-less version when it comes to real-world gaming. Both of the cards we tested came overclocked from the factory and didn’t offer much room for further tuning, so this is really all of the performance you'll get from them.

Fortunately for AMD, Nvidia doesn’t really have a competing product in this category. The GeForce GTX 960 is just too slow, and the 970 is significantly more expensive. AMD’s new Radeon R9 380X is positioned right in the middle of that gap, whereas the 380 is a bit closer to Nvidia’s GeForce GTX 960, and its price is competitive with that GeForce as well.

  • ingtar33
    so full tonga, release date 2015; matches full tahiti, release date 2011.

    so why did they retire tahiti 7970/280x for this? 3 generations of gpus with the same rough number scheme and same performance is sorta sad.
  • wh3resmycar
    this has all the features, freesync, trueaudio, etc.
  • logainofhades
    Features, and power ratings are lower. Tonga is 190W vs. 250W for Tahiti.
  • Eggz
    Seems underwhelming until you read the price. Pretty good for only $230! It's not that much slower than the 970, but it's still about $60 cheaper. Well placed.
  • chaosmassive
    been waiting for this card review, I saw photographer fingers on silicon reflection btw !
  • Onus
    Once again, it appears that the relevance of a card is determined by its price (i.e. price/performance, not just performance). There are no bad cards, only bad prices. That it needs two 6-pin PCIe power connections rather than the 8-pin plus 6-pin needed by the HD7970 is, however, a step in the right direction.
  • FormatC
    Quote:
    I saw photographer fingers on silicon


    I know, these are my fingers and my wedding ring. :P
    Call it a unique watermark. ;)
  • psycher1
    Honestly I'm getting a bit tired of people getting so over-enthusiastic about which resolutions their cards can handle. I barely think the 970 is good enough for 1080p.

    With my 2560*1080 (to be fair, 33% more pixels) panel and a 970, I can almost never pull off ultimate graphic settings out of modern games, with the Witcher 3 only averaging about 35fps while at medium-high according to GeForce Experience.

    If this is already the case, give it a year or two. Future proofing does not mean you should need to consider sli after only 6 months and a minor display upgrade.
  • Eggz
    1272112 said:
    Honestly I'm getting a bit tired of people getting so over-enthusiastic about which resolutions their cards can handle. I barely think the 970 is good enough for 1080p. With my 2560*1080 (to be fair, 33% more pixels) panel and a 970, I can almost never pull off ultimate graphic settings out of modern games, with the Witcher 3 only averaging about 35fps while at medium-high according to GeForce Experience. If this is already the case, give it a year or two. Future proofing does not mean you should need to consider sli after only 6 months and a minor display upgrade.


    Yeah, I definitely think that the 980 ti, Titan X, FuryX, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.
  • ByteManiak
    everyone is playing GTA V and Witcher 3 in 4K at 30 fps and i'm just sitting here struggling to get a TNT2 to run Descent 3 at 60 fps in 800x600 on a Pentium 3 machine
  • blazorthon
    Quote:
    Yeah, I definitely think that the 980 ti, Titan X, FuryX, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.


    The games we have today also happen to be the first games that pushed cards like the 7970 and the 680 to no longer be adequate for 1440p, let alone 1080p with settings maxed out. It's a pointless chicken and the egg argument.
  • ErikVinoya
    With those temps and the backplate, I think with a bit of oil, I can make breakfast while running furmark on that
  • blazorthon
    1027081 said:
    so full tonga, release date 2015; matches full tahiti, release date 2011. so why did they retire tahiti 7970/280x for this? 3 generations of gpus with the same rough number scheme and same performance is sorta sad.


    I fail to see how lower power consumption, new features, higher memory capacity, and lower prices (both to manufacture and for us to buy) are sad. It isn't as if the 380X is now the high end. It is just a mid-ranged card that is replacing a formerly high-end card. This happens every graphics card generation. GTX 960 and GTX 580, Radeon 7850 and Radeon 5870, the list goes on and on. Furthermore, the 380X has a considerable performance advantage over the 7970 despite having the same core count because core count is far from everything and many other things changed from Tahiti to Tonga/Antigua. The 380X is easily about 20% faster than the 7970.
  • eklipz330
    your perception of what is enough for 1080p may differ from the author's. think about it. my r9 290 could handle most games at 1080p above 60fps, and many of them above 100fps. that is a VERY capable card at 1080p, and probably more than enough for the average gamer. not everyone has the itch to max every setting out; i personally like lowering the resolution and increasing AA.
  • Eggz
    412399 said:
    The games we have today also happen to be the first games that pushed cards like the 7970 and the 680 to no longer be adequate for 1440p, let alone 1080p with settings maxed out. It's a pointless chicken and the egg argument.


    That's probably true to a certain extent, but I'm not sure I entirely agree. You'll be able to run today's most demanding games, and future games, with max settings at low resolutions on current- and past-gen cards. If you take the chicken and egg idea to its logical end, then there would eventually be a game that not even the most powerful card could run, even at the lowest resolution possible. But that's not what happens. With a good CPU, there are cards that will be able to run anything at (for example) 640x480, no matter what settings, and no matter which game.

    There's just an upper limit to processing requirements based on resolution and a target frame rate. Games don't usually reach that upper limit, but there is one. Ray tracing might be the best real-world approximation of demanding the most out of a certain resolution and frame rate. Whatever that limit is, though, there will be a card that can handle it for lower resolutions. If that weren't true, then someone would be able to build a color switching app for one (1) pixel, a resolution of 1x1, that could bring down the Titan X to less than 60 fps at all times. I don't think that would be possible with proper coding applied.

    That plays out in the real world by certain cards being able to handle any game, at any setting, at a certain resolution while maintaining a targeted frame rate. I'd bet $1,000 that a resolution of 16x9 would be incapable of bringing the frame rate below 60 fps for any game to exist in the foreseeable future using a flagship graphics card (setting aside incompatibilities and non-graphics bottlenecks).

    I could be totally wrong about there being an upper limit on processing requirements within a bound resolution and minimum frame rate, and I'm open to being shown why that upper limit theory isn't true. But I'm inclined to think it's true until I see exactly where I'm off.
  • red77star
    My HD7970 Crossfire setup still going strong. Every single game maxed on 1080p
  • spentshells
    I am dissapoint, I was expecting the full 384-bit memory bus and substantially higher performance. I had been hyping this for a while as the reason for NV to release a 960 Ti, and it most certainly won't be.
  • blazorthon
    1406980 said:
    That's probably true to a certain extent, but I'm not sure I entirely agree. You'll be able to run today's most demanding games, and future games, with max setting at low resolutions on current- and past-gen cards. If you take the chicken and egg idea to its logical end, then there would eventually be a game that not even the most powerful card could not run at even at the lowest resolution possible. But that's not what happens. With a good CPU, there are cards that will be able to run anything at (for example) 640x480, no matter what settings, and no matter which game. There's just an upper limit to processing requirements based on resolution and a target frame rate. Games don't usually reach that upper limit, but there is one. Ray tracing might be the best real-world approximation of demanding the most out of a certain resolution and frame rate. Whatever that limit is, though, there will be a card that can handle it for lower resolutions. If that weren't true, then someone would be able to build a color switching app for one (1) pixel, a resolution of 1x1, that could bring down the Titan X to less than 60 fps at all times. I don't think that would be possible with proper coding applied. That plays out in the real world by certain cards being able to handle any game, at any setting, at a certain resolution while maintaining a targeted frame rate. I'd bet $1,000 that a resolution of 16x9 would be incapable of bringing the frame rate below 60 fps for any game to exist in the foreseeable future using a flagship graphics card (setting aside incompatibilities and non-graphics bottlenecks). I could be totally wrong about there being an upper limit on processing requirements within a bound resolution and minimum frame rate, and I'm open to being shown why that upper limit theory isn't true. But I'm inclined to think it's true until I see exactly where I'm off.


    Quote:
    Yeah, I definitely think that the 980 ti, Titan X, FuryX, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.


    Go back a few years, say to when the 7970 GHz edition launched so we have pretty much all of that generation's cards out and most driver kinks were worked out on both sides. The 7970 cards and the 680 were capable of running pretty much all games in 1440p in ultra and the 7850/650 Ti Boost could run pretty much all games in 1080p in ultra, all assuming there wasn't another bottleneck (such as a weak CPU).

    So, the current high end cards were not the first cards to be adequate for 1440p, let alone 1080p. Today's top end cards are merely the first to be able to run today's most intensive games in their heavier settings at 1080p and 1440p. This isn't just true to some extent because the older reviews prove it (have a look at some Radeon 7970 GHz Edition reviews for a good lineup). This is what I meant by chicken and the egg. Current top-end cards don't let you "forget about graphics bottlenecks at a higher standard resolution [1440p]," anymore than top end cards from previous generations did with corresponding games.

    I don't see how low resolutions, settings, and any performance limits related to them and CPUs are relevant to this.

    Anything with a resolution of one pixel might be useless for this topic. I don't think things like tessellation, AA, AF, and so on can be performed on a resolution of one pixel. In that case, yes, there might be a limit to how much performance can be necessary before it becomes arbitrary, but I say that out of not knowing what modern features can really do with only one pixel. With a real resolution, let's use a low one like 640x480, you can always apply more and more settings, filters, etc. until you bog down even Titan X. Whether or not a game happens to support applying enough of these to such a resolution is another matter, but it can be done if some dev wanted to make such a program. The real limit is at what point you're throwing more and more resources at the problem without a discernible improvement in visual quality. For example, we could make a game that runs at 640x480 with 1024x MSAA or something like that, but why bother when you won't see any benefit from a minuscule fraction of that?
  • psycher1
    So we're at least semi agreeing that, limiting the discussion to modern titles and looking forward, cards like the title article's 380X are most definitely NOT 1440p cards, and at best are 1080p cards that are already staring that limit in the face as well.

    With the 680 being talked about in the past tense so soon, I wonder if I didn't even underestimate how long my 970 will last.

    Anybody here think that dx12 and other such software improvements will help keep these cards relevant? Or are we still looking at a ~3 year upgrade cycle to just run games at decent settings, even with mostly top-of-the-line hardware today?
  • turkey3_scratch
    I don't think this card cuts it with the $230-240 price tag, when a 380 can be found for $170. That is 35% more price for about 8% more performance. However, I would not be surprised if the 380X slowly goes down to below $200 and knocks the 380 out of sale. If the 380X was priced at $200 then it would be a good deal.

    Sure, you can compare it to the fact that the 380 cost about $230 or so when it was released, but the 380 has been around a good deal of time and the prices have dropped significantly.

    Also, why on earth are they using a Powercolor 390 and an MSI 970 rather than using the MSI 390?
  • TbsToy
    Oh boy, another graphics card that comes to save the day for a non-question and a nonexistent need. Uhhhh, how much does it cost again?
    Walt Prill
  • turkey3_scratch
    All graphics cards are usually bad value when they're released. It's just like any product in this world. Give it 2 months and this card will be competing.
  • monsta
    another overhyped release from AMD with lacklustre performance upgrades, this is embarrassing
  • eodeo
    You would do well to look at how sleek TechPowerUp's charts look and how neatly they are organized, per game and overall.

    More to the point of this post: you say that the 380X uses virtually the same idle power with two monitors, while TPU says it's the typical AMD crapscore it used to be, 3-4x more than it should be and than you say it is. Did they make a mistake, or did you?

    http://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/21.html

    Also, while we're here, why not mention how poor AMD's video playback power draw is? Along with the multi-monitor power draw, video playback has been a huge negative for AMD for the past 3 years, ever since Nvidia decided to fix it in 2012 on all their DX11 GPUs via drivers...