Overclocking, Noise, Temperature & Power
Before I get into our overclocking experience, I need to mention that this isn't our first sample of PowerColor's Devil R9 390X. A pre-production card landed in our lab several months ago. Unfortunately, the early sample arrived with a pre-production BIOS that affected its clock rate (it wouldn't run at the advertised frequencies, causing display artifacts so severe that the test platform was unusable).
PowerColor sent a second sample that arrived with the same BIOS. But the company also provided an updated firmware to flash to the card. After writing that file to the board, we continued seeing significant display issues. PowerColor suggested that there could be a problem with the installed software, and recommended using Display Driver Uninstaller. After that utility removed all remnants of AMD's and Nvidia's driver packages, Catalyst 15.7.1 was reinstalled. Sadly, the distorted display remained.
While troubleshooting the issue, I determined that our DisplayPort cable was the problem's source. I use a Samsung U28D590D 4K display for testing, and for whatever reason, the bundled cable doesn't play nice with PowerColor's graphics card.
Overclocking
Nothing beyond the factory-stated specifications is ever guaranteed, but manufacturers don't make extreme cooling solutions without expecting their cards to be pushed to the limit. To that end, once we had a stable card, overclocking it was a priority. For science.
First, I ran 3DMark to get a baseline score for the Devil R9 390X. Strangely, it ended up lower than the score I recorded from Sapphire's Nitro R9 390 earlier this summer. Next, I opened MSI Afterburner and maxed out the power limit, then ran the baseline test again to make sure the higher limit didn't introduce any instability. The results came back the same, and I observed no artifacts on-screen.
Using 10MHz increments, I found that the GPU core maxed out at 1160MHz. That wasn't a particularly large increase, but it was free performance nonetheless. Once artifacts started showing up, I dropped back to the default frequency to ensure no damage was done to the GPU. During the following test, I observed significant corruption, so I halted the benchmark immediately.
Upon verifying the settings in Afterburner, I discovered that resetting to the defaults had actually pushed the core voltage up by 100mV; the pre-release BIOS we were sent appears to have a bug. Fortunately, this didn't cause any permanent damage, and dropping the voltage back down restored stability. I later found that re-seating the graphics card also clears the errant voltage.
Further testing revealed that the highest stable clock rate remained 1160MHz. It's possible that increasing the voltage a little could help, but after what I had just gone through, I didn't want to push my luck with more juice.
The memory on our sample came clocked at 1525MHz, and this particular board didn't take kindly to pushing it much further; beyond 1545MHz, the on-screen graphics started breaking up into green squares. Frankly, these weren't the results I was hoping for from such an aggressively cooled card.
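For readers who want the procedure in one place, here's a minimal sketch of the sweep described above. It's purely illustrative: the real tuning was done by hand in MSI Afterburner, and the helper functions and the 1100MHz starting point are hypothetical stand-ins rather than any real API.

```python
# Illustrative sketch of the manual overclocking sweep described above.
# Every step was actually performed by hand in MSI Afterburner; these
# helpers are hypothetical placeholders for "set the clock, run a
# benchmark pass, watch the screen for artifacts".

def set_core_clock_mhz(mhz: int) -> None:
    """Hypothetical stand-in for dialing a core clock into Afterburner."""
    print(f"Core clock set to {mhz}MHz")

def benchmark_is_clean(mhz: int) -> bool:
    """Hypothetical stand-in for a 3DMark pass plus a visual artifact check.
    Modeled after our sample, which stayed clean up to 1160MHz."""
    return mhz <= 1160

def find_max_stable_clock(start_mhz: int, step_mhz: int = 10) -> int:
    """Walk the core clock up in small increments until artifacts appear,
    then fall back to and report the last clean frequency."""
    clock = start_mhz
    last_stable = start_mhz
    while True:
        set_core_clock_mhz(clock)
        if not benchmark_is_clean(clock):
            # Artifacts showed up: return to the last known-good frequency.
            set_core_clock_mhz(last_stable)
            return last_stable
        last_stable = clock
        clock += step_mhz

if __name__ == "__main__":
    # 1100MHz is used here purely as an illustrative starting point.
    print("Highest stable core clock:", find_max_stable_clock(1100), "MHz")
```

The same walk-up-and-back-off loop applies to the memory clock; on our sample it simply hit its wall much sooner.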
Noise
We typically report a noise measurement taken from the rear of the card at a distance of two inches. In this instance, however, I felt it necessary to show two readings, as the first one doesn't tell the whole story. It would be misleading to show only the noise levels directly behind a water-cooled GPU, especially when the majority of the sound comes from the radiator.
Readings from the back of the card fall within the same range as much of the competition, though, unlike many other cards, the Devil R9 390X's 12cm fan doesn't stop spinning at idle.
Measurements from the radiator are much louder. Its fan pushes quite a bit of air, but it generates significant noise in the process. A reading taken next to the graphics card inside the case was louder still. During gameplay, the decibel meter registered 62.5 dB in front of the fan on the card, and from two feet away I was still hearing 37 dB.
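As a quick back-of-the-envelope sanity check, assuming the 62.5 dB figure was taken at our usual two-inch distance and treating the fan as a point source in a free field (a crude assumption inside a case), the simple inverse-distance rule predicts roughly 41 dB at two feet, in the same ballpark as the 37 dB we actually heard.

```python
import math

# Rough sanity check on the two gaming-load readings above.
# Sound pressure level from a point source falls by 20*log10(d2/d1) dB
# as the measuring distance grows from d1 to d2.
near_db = 62.5         # measured in front of the fan
near_distance_in = 2   # assumes our usual two-inch measuring distance
far_distance_in = 24   # two feet

predicted_far_db = near_db - 20 * math.log10(far_distance_in / near_distance_in)
print(f"Predicted level at two feet: {predicted_far_db:.1f} dB")  # ~40.9 dB
```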
Temperature
There was no doubt that the Devil R9 390X would be among the coolest cards we've tested. The closed-loop cooler is more effective at dissipating the GPU's substantial heat than a conventional heat sink and fan.
The graph confirms our hypothesis; the closed-loop cooler does its job well. Even after 10 minutes of full load, the GPU is only a few degrees warmer than its starting point, settling in around 55 degrees Celsius even with the overclock applied.
Power Consumption
It's no surprise that AMD's GPUs are power-hungry. PowerColor's Devil R9 390X uses one of AMD's highest-end processors and overclocks it, and there's also a pump in the loop that needs power. Clearly, this was never meant to be an efficient graphics card.
And there it is; the R9 390X tops the chart. Peak power draw in the torture test isn't as high as the Fury's, but under a more realistic gaming load, the R9 390X uses almost 20W more. Even the card's idle power draw is 10W higher than the R9 390's, and with the overclock applied, the spread grows even larger.
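To put that idle-power gap in perspective, here's a trivial bit of arithmetic; the always-on assumption is ours for illustration, not a measured usage pattern.

```python
# What a 10W idle-power gap over the R9 390 adds up to, assuming
# (hypothetically) the machine sits idle around the clock all year.
extra_idle_watts = 10
hours_per_year = 24 * 365

extra_kwh_per_year = extra_idle_watts * hours_per_year / 1000
print(f"Extra energy at idle: {extra_kwh_per_year:.0f} kWh per year")  # ~88 kWh
```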