PowerColor Devil R9 390X Review

Gaming Benchmarks

Battlefield 4

Battlefield 4 is several years old, but it's still a great workload for even the most powerful graphics cards. PowerColor's Devil R9 390X, with its aggressive overclock, fares well against the competition.

In stock form, PowerColor's card performs as expected. It's slightly slower than the Fury X, and considerably faster than the Radeon R9 390. This game is one where Nvidia's GeForce GTX 970 outperforms the R9 390, matching the Devil at its factory clock rates.

QHD is where the Radeon R9 390X really shines, though. The R9 390X Devil performs well here, keeping pace with Nvidia's GeForce GTX 980 and the Fury Tri-X. If your native resolution is 2560x1440, you have to love a graphics card averaging right around 60 FPS.

Although 4K gets a lot of attention for its ability to punish even high-end hardware, it's still fairly rare (under 0.1%, according to Steam's most recent hardware survey), likely owing to the resolution's requirements for fluid game play. Despite the R9 390X Devil's 8GB of memory and modest overclock, a Grenada GPU isn't fast enough to drive 4K on its own. You'd need a faster graphics card or multiple cards in CrossFire.
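To see why the jump to 4K is so punishing, a bit of back-of-the-envelope arithmetic (ours, not from the benchmark data) shows how the pixel count scales across the three resolutions tested:

```python
# Pixel counts for the three resolutions used in this review.
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (4K)":  (3840, 2160),
}

fhd_pixels = 1920 * 1080  # 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    # QHD is 1.78x the pixels of FHD; 4K is a full 4.00x.
    print(f"{name}: {pixels:,} pixels ({pixels / fhd_pixels:.2f}x FHD)")
```

All else being equal, a card averaging 60 FPS at 1080p has roughly a quarter of that headroom at 4K, which lines up with the mid-30s averages seen here.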

Far Cry 4

Far Cry 4 is much newer, and it is written using some of Nvidia's GameWorks technologies. You'd naturally expect it to favor that company's hardware. But Radeon owners still play the game, so we need to know what to expect.

The Devil R9 390X performs well at 1920x1080, averaging around 2 FPS less than Sapphire's Fury Tri-X. After overclocking, we even got the Devil card to lead.

Nvidia's GeForce GTX 970 averages roughly the same performance, but achieves a much higher minimum frame rate, staying above 60 FPS at all times.

As we increase the resolution, Nvidia's boards start to slip. Meanwhile, the Devil manages 66 FPS in its stock form and 72 FPS after overclocking. The Fury Tri-X is faster still, managing 12 more frames per second at the top end.

The Devil R9 390X fares well in Far Cry 4 at 4K. At no point does the frame rate dip below 30, and it's usually in the mid- to high-30s. Those numbers aren't great, but at 3840x2160, it's what we've come to expect. Still, you'd be happier with a faster card or multiple GPUs rendering cooperatively.

Grand Theft Auto V

GTA V is notorious for utilizing a lot of graphics memory. If any title is going to benefit from the 390X's 8GB, this should be it.

In stock form, PowerColor's R9 390X Devil manages a significant lead over the Fury, averaging approximately 10 FPS more. It goes without saying that the card we're reviewing handles these settings with ease at 1080p.

With the resolution cranked up to 1440p, the gap between PowerColor's Devil R9 390X and Sapphire's R9 Fury Tri-X disappears. While the Devil continues to lead, the difference is negligible. The Fiji GPU and its 4GB of HBM are not yet a bottleneck.

At 3840x2160, HBM's superior bandwidth allows the Fury to extend its advantage over the Devil R9 390X. More capacity doesn't seem to help the Grenada GPU. Regardless, the minimum frame rates from both cards are too low for this to be a viable setup. You'd want to scale back on the detail settings or add hardware.

Metro: Last Light

Running Metro: Last Light at 1080p yields well over 100 FPS from the entire field.

For gamers playing on QHD displays, the Devil R9 390X is an excellent option. We measured frame rates in the low 70s, with minimums that remain north of 40. The Fury card enjoys a sizable lead over PowerColor's card, while Nvidia's GeForce GTX 980 trails by nearly 8 FPS.

The Devil R9 390X doesn't fare as well at 4K as the Fury, which isn't constrained by its 4GB of HBM and seems to really benefit from its extra shading resources. Interestingly, the GeForce GTX 980 and 780 Ti are in the same league as PowerColor's board.

Middle-earth: Shadow of Mordor

In stock form, PowerColor's Devil R9 390X handles 1920x1080 as well as Sapphire's Fury Tri-X; overclocking pushes it ahead of the Fiji-based board.

Running at 2560x1440, the Devil loses its edge over the Fury, which assumes the lead. Still, the Devil R9 390X is best suited to this resolution. You pay less than you would for a Fury and still get playable frame rates at QHD.

Another resolution bump takes us to 4K, where the Devil catches back up to Sapphire's Fury. Prior to overclocking, the Devil R9 390X lands within one frame per second of the HBM-equipped board on average. After tuning PowerColor's card, they're almost indistinguishable.

Tomb Raider

As you can see from the graphs, our short benchmark run in Tomb Raider features highly variable frame rates. PowerColor's Devil R9 390X outpaces the Fury by an average of 4 FPS, though the Fury doesn't dip as low as the Devil.

At 2560x1440, the difference between the 390X and the Fury closes (though overclocked settings confer an advantage to PowerColor's card). With an average frame rate in the 70s, Tomb Raider can easily be considered playable, despite minimums that drop into the 30s.

If you want a decent frame rate at 4K, you'll need to lower this game's detail settings. Though the Devil R9 390X manages to pass the Fury, neither card touches 60 FPS on the high side. Averages approaching 40 aren't bad, but those minimums are jarring, to be sure.

MORE: Best Graphics Cards
MORE: All Graphics Content

22 comments
  • utroz
    Hmm.. So a pre almost max OCed card with watercooling.. Only around 6 months late (or more if you count the 290X 8GB as basically the same as a 390X). At this point if you have a decent card wait for 16nm..
  • BrandonYoung
An impressive result by AMD and PowerColor! I'm looking forward to future (more modern) releases from these companies, hoping they bring more competition into the once-stagnant GPU realm!

I'm aware this is a review of the Devil R9, yet I'm curious why the GTX 980 was mentioned in the noise graph but omitted from the temperature graph. I get the feeling it's because it would show the card throttling based on thermals, which would help explain its performance in the earlier tests. This is strictly speculation on my behalf, however, and highly biased, as I currently own a 980.
  • ryguystye
    Does the pump constantly run? I wish there was a hybrid liquid/air cooler that ran the fan only when idle, then turned on the water pump for more intensive tasks. I don't like the noise of water pumps when the rest of my system is idle
  • fil1p
    Quote:
    Does the pump constantly run? I wish there was a hybrid liquid/air cooler that ran the fan only when idle, then turned on the water pump for more intensive tasks. I don't like the noise of water pumps when the rest of my system is idle


    The pump has to run, even at low RPMs, otherwise the card would overheat. The waterblock itself is generally not enough to dissipate heat. The waterblock simply transfers the heat to the water and the radiator does almost all of the heat dissipation. If the pump is off there is no water flow through the radiator, no water flow will mean heat from the waterblock is not dissipated, causing the water in the waterblock to heat up and the GPU to overheat.
  • elho_cid
What's the point of testing on Windows 8.1? I mean, there was enough time to upgrade to Windows 10 already... It has been shown several times that Windows 10 often provides a measurable performance advantage.
  • Sakkura
    To be fair, overclocking headroom varies from GPU to GPU. Maybe you just got a dud, and other cards will overclock better.
  • Cryio
Why test on 15.7? Seriously, that's like six drivers old. AMD stated they will release WHQL drivers just occasionally, with more betas throughout the year. You're doing them a disservice benching only "official" drivers.

    Nvidia's latest ... dunno, 12 drivers in the last 3 months were all official and most of them broke games or destroyed performance in a lot of other games.
  • kcarbotte
    Quote:
    Why test on 15.7? Seriously, that's like 6 drivers old. AMD stated they will release WHQL just ocasionaly, with more Beta throughout the year. You're doing them a diservice benching only "official" drivers. Nvidia's latest ... dunno, 12 drivers in the last 3 months were all official and most of them broke games or destroyed performance in a lot of other games.


At the time this review was written, it was not that old. As mentioned in the article, we first got this card over the summer. The tests were done a couple of months ago now, and at the time they were done with the driver that PowerColor suggested after we had problems with the first sample.

    Quote:
    An impressive result by AMD and PowerColor! I'm looking forward to future (more modern) released by these companies hoping to bring more competition into the once stagnant GPU realm! I'm aware this is a review of the Devil R9, yet I'm curious why the GTX 980 was mentioned in the noise graph, but omitted in the temperature graph, I get the feeling its because it will show the card was throttling based on thermals, helping describe its performance in the earlier tests, this is strictly speculation on my behalf however, and highly bias as I currently own a 980.


The temperature of the 980 was omitted because the ambient temperature of the room was 3 degrees cooler when that card was tested, which affected the results. I didn't have the GTX 980 in the lab to redo the tests with the new sample. I had the card when the defective 390X arrived for the roundup, but by the time the replacement came back, it was on loan to another lab.
    Rather than delay the review even longer, I opted to omit the 980 from the test.

It had nothing to do with hiding any kind of throttling result. If that were found, we wouldn't sweep it under the rug.

    Quote:
    What's the point testing on the windows 8.1? I mean, there was enough time to upgrade to windows 10 already... It was shown several times that the new W10 often provide measurable performance advantage.


    We have not made the switch to Windows 10 on any of our test benches yet. I don't make the call about when that happens and I don't know the reasons behind the delay.
  • Cryio
    Well then, sir @kcarbotte, I can't wait until you guys get to review some AMD GPUs on Windows 10 with the new Crimson drivers and some Skylake i7s thrown into the mix !
  • kcarbotte
    414569 said:
    Well then, sir @kcarbotte, I can't wait until you guys get to review some AMD GPUs on Windows 10 with the new Crimson drivers and some Skylake i7s thrown into the mix !


    You and me both!
I have a feeling that the Crimson drivers show better gains in Win10 than they do in older OSes.
  • monsta
    Quote:
    An impressive result by AMD and PowerColor! I'm looking forward to future (more modern) released by these companies hoping to bring more competition into the once stagnant GPU realm! I'm aware this is a review of the Devil R9, yet I'm curious why the GTX 980 was mentioned in the noise graph, but omitted in the temperature graph, I get the feeling its because it will show the card was throttling based on thermals, helping describe its performance in the earlier tests, this is strictly speculation on my behalf however, and highly bias as I currently own a 980.

    *Face palm*
    How is this impressive? The reviewer could not recommend it and said there are better options available LOL
  • TJClark
    UNTIL SOME KIND OF DRASTIC CHANGE COMES ABOUT.......
    WHEN THE PRICE OF HIGHEST DEF MONITORS COME DOWN TO REALITY......
    I'm happier than a pig in poop because the price per performance gains are MINISCULE......MY XFX R290 / LG 27 inch 1920 X 1080 work beautifully, thank you....VSR makes my display explode with reality ! ! !
    p.s. T.V. 's are cheap....When a monitor is classified as a PC device it's price soars.
    Are we really that stupid ? A HIGH DEF, 4K TV is 100's of dollars cheaper than a " computer monitor " That pertains to ALL definitions and widths.....
  • blazorthon
    2130723 said:
    UNTIL SOME KIND OF DRASTIC CHANGE COMES ABOUT....... WHEN THE PRICE OF HIGHEST DEF MONITORS COME DOWN TO REALITY...... I'm happier than a pig in poop because the price per performance gains are MINISCULE......MY XFX R290 / LG 27 inch 1920 X 1080 work beautifully, thank you....VSR makes my display explode with reality ! ! ! p.s. T.V. 's are cheap....When a monitor is classified as a PC device it's price soars. Are we really that stupid ? A HIGH DEF, 4K TV is 100's of dollars cheaper than a " computer monitor " That pertains to ALL definitions and widths.....


Are you joking? Newegg's cheapest 4K monitor is $300 and their cheapest 4K TV is $250. Furthermore, many of the 4K monitors are only around $350, whereas the next cheapest 4K TVs are over $400.

    Also, labeling a TV as a computer monitor does not make it the same as a computer monitor. Monitors are generally made to have lower response time and there are other differences like that.
  • BrandonYoung
    135495 said:
    Quote:
    An impressive result by AMD and PowerColor! I'm looking forward to future (more modern) released by these companies hoping to bring more competition into the once stagnant GPU realm! I'm aware this is a review of the Devil R9, yet I'm curious why the GTX 980 was mentioned in the noise graph, but omitted in the temperature graph, I get the feeling its because it will show the card was throttling based on thermals, helping describe its performance in the earlier tests, this is strictly speculation on my behalf however, and highly bias as I currently own a 980.
    *Face palm* How is this impressive? The reviewer could not recommend it and said there are better options available LOL


    Because, it directly competes with Nvidia's higher-end products, which is nice to see once more. Competition promotes advancement which is good for everyone.

    Without competition, advancement slows to a crawl (see Intel).
  • BrandonYoung
    412399 said:
    2130723 said:
    UNTIL SOME KIND OF DRASTIC CHANGE COMES ABOUT....... WHEN THE PRICE OF HIGHEST DEF MONITORS COME DOWN TO REALITY...... I'm happier than a pig in poop because the price per performance gains are MINISCULE......MY XFX R290 / LG 27 inch 1920 X 1080 work beautifully, thank you....VSR makes my display explode with reality ! ! ! p.s. T.V. 's are cheap....When a monitor is classified as a PC device it's price soars. Are we really that stupid ? A HIGH DEF, 4K TV is 100's of dollars cheaper than a " computer monitor " That pertains to ALL definitions and widths.....
    Are you joking? Neweggs cheapest 4K monitor is $300 and their cheapest 4K TV is $250. Furthermore, many of the 4K monitors are only around $350 whereas the next cheapest 4K TVs are over $400. Also, labeling a TV as a computer monitor does not make it the same as a computer monitor. Monitors are generally made to have lower response time and there are other differences like that.


Not to mention most 4K TVs lack DisplayPort or HDMI 2.0 inputs, which enable refresh rates above 30Hz at 4K resolutions. No one wants to use a 30Hz 4K TV as a monitor.
  • wayfarer1
After spending $400 on a 390X and not getting it to work, AMD is done! As far as I'm concerned, AMD doesn't exist anymore.
  • TJClark
    1454811 said:
    After spending 400 on a 390x and not getting it to work AMD is done! As far as I'm concerned AMD doesn't exist anymore.
  • TJClark
    Sorry to hear that......
  • cobra5000
    Kind of sad that you don't know how to install a video card, wayfarer. Sounds like it was a p.i.c.n.i.c.
  • m16shooter
All I can say is that I'm running two of these in CrossFire and couldn't be happier. They play every game I throw at them with Ultra settings on my FreeSync 2K screen. I'm NOT overclocking them because, to me, the extra few frames per second won't be noticeable when I'm already at 100+.
  • TbsToy
    The ole gaming graphics $ chase thing by the marketers to transfer yer dollars to their accounts again!
    Walt Prill
  • Tzn
Again, AMD's power consumption is a big no-no.