AMD Radeon R9 Fury Review: Sapphire Tri-X Overclocked

Power Consumption And Efficiency

We’re using the same benchmark system as in our AMD Radeon R9 Fury X launch article, though with a couple of tweaks. Idle power consumption is now measured on a system loaded with the kind of software that tends to accumulate over time, whereas last time the installation was completely fresh. In addition, linear interpolation is now applied to the data by the oscilloscope itself rather than during analysis.
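For readers curious about what that interpolation step actually does, here is a minimal sketch in Python (using numpy and made-up sample points, not our actual capture data) of resampling scope readings onto a uniform time base before averaging:

    import numpy as np

    # Sketch: linear interpolation of scope samples onto a uniform time base
    # (hypothetical values; a real capture contains far more points).
    t_raw = np.array([0.0, 0.9, 2.1, 3.0])          # sample timestamps, ms
    p_raw = np.array([250.0, 262.0, 247.0, 255.0])  # instantaneous power, W

    t_uniform = np.linspace(0.0, 3.0, 7)            # uniform 0.5 ms grid
    p_uniform = np.interp(t_uniform, t_raw, p_raw)  # linear interpolation

    print(p_uniform.mean())                         # average power over the window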

The biggest change, which we’ll stick with for all future launch articles, concerns the content, though.

We’ll look at power consumption in direct relation to gaming performance, and we’ll do so separately for 1920x1080 and 3840x2160 since there are major differences between the two. We’ll also look at several different games, and even run some of them with different settings, such as tessellation. In addition, we’ve added some applications that aren’t related to gaming. Life’s not always just about gaming, after all.
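Since "efficiency" here simply means average power draw divided by average frame rate, the metric is easy to reproduce. Below is a minimal sketch in Python; the 254W figure for the default BIOS appears later in this article, while the frame rates are purely hypothetical placeholders:

    # Sketch: watts-per-FPS efficiency metric. Lower is better.
    measurements = {
        # card: (average power draw in watts, average frame rate in FPS)
        "Sapphire R9 Fury Tri-X (default BIOS)": (254.0, 40.0),  # hypothetical FPS
        "GeForce GTX 980": (160.0, 38.0),                        # hypothetical FPS
    }

    for card, (watts, fps) in measurements.items():
        print(f"{card}: {watts / fps:.2f} W/FPS")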

Bear in mind that these tests are merely snapshots of specific applications. A benchmark that isn't part of our suite will likely fall somewhere between our average figure and the torture-test peak. There are no absolutes here; the best estimate is always a range.

Idle Power Consumption

At 16W, the partner card’s power consumption is the same as the original Radeon R9 Fury X. Also, nothing changes if a TV with a different refresh rate and a 4K monitor with a different resolution are connected. This is nice to see.

Ultra HD (3840x2160) Using Default And Unlocked BIOS

Having two BIOSes is nice, but they do practically nothing for performance. We’ll see later that the frequencies stay the same no matter which firmware you use. The unlocked BIOS just allows slightly higher average voltages over longer time periods, resulting in marginal frame rate differences.

Compared to the (slightly) slower GeForce GTX 980 with its average power consumption of 160W, the default BIOS’ 254W and unlocked BIOS’ 259W look absolutely massive.

The following two picture galleries can be used to flip back and forth between watt per FPS (Picture 1), power consumption (Picture 2) and benchmark results (Picture 3) for all games and settings we tested. We’ll first take a look at the default BIOS:

The unlocked BIOS’ only contributions are worse efficiency and a hotter graphics card:

Full HD (1920x1080) Using Default BIOS

We’re using the default BIOS exclusively at this resolution: spot tests showed an increase of just one to three watts after switching to the unlocked BIOS, and the higher power limit doesn’t impact performance at all.

Once again, and as we already know, the GeForce GTX 980 is a bit slower, but uses a lot less power and is consequently more efficient.

Efficiency For Different Applications

Let's shift away from gaming for a moment. Four tests show that professional software needs to be measured on a title-by-title basis, because driver optimization, and with it application performance, can vary wildly. Generalizing just isn’t possible. We’re still offering a quick performance snapshot here.

We set the Sapphire R9 Fury Tri-X’s performance as 100 percent, since the different benchmarks have dissimilar scoring systems.
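As a quick illustration of that normalization (a sketch with hypothetical scores, since each benchmark uses its own scoring scale), every result is simply expressed as a percentage of the Sapphire card's score:

    # Sketch: normalize benchmark scores so the Sapphire R9 Fury Tri-X = 100 percent.
    scores = {
        "Sapphire R9 Fury Tri-X": 1520.0,  # hypothetical score
        "GeForce GTX 980": 1480.0,         # hypothetical score
    }

    baseline = scores["Sapphire R9 Fury Tri-X"]
    for card, score in scores.items():
        print(f"{card}: {100.0 * score / baseline:.1f}%")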

Not taking performance into account, the internal power consumption duel can be summarized like this:

So What Does The Unlocked BIOS Really Get You?

If we were mean, we’d answer the above question with "basically nothing." The second firmware is intended for overclockers, but its inclusion is questionable on a graphics card with so little frequency headroom. Other than the marginally higher power consumption during gaming, there’s barely any difference to be found between the two BIOSes. However, the stress test’s voltage graph shows that the excursions to higher voltages occur at substantially shorter intervals with the unlocked BIOS.

These are what increase the power consumption slightly, even if the bumps have no measurable impact on gaming performance.

It’s hard to draw a definitive conclusion about the Sapphire R9 Fury Tri-X knowing that AMD says upcoming drivers will tease more performance out of it. However, it is clear that, due to the higher leakage current, the air-cooled partner card is less efficient than the liquid-cooled Radeon R9 Fury X. Its power consumption is significantly higher than that of comparable Nvidia graphics cards.

  • Troezar
    Some good news for AMD. A bonus for Nvidia users too, more competition equals better prices for us all.
  • AndrewJacksonZA
    Kevin, Igor, thank you for the review. Now the question people might want to ask themselves is, is the $80-$100 extra for the Fury X worth it? :-)
  • bjaminnyc
    My next card, nice job AMD.
  • vertexx
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!
  • eza
    fyi - typos in verdict: should be "has proven" and "fewer texture units"
  • ern88
    I would like to get this card. But I am currently playing at 1080p and will probably go to 1440p soon!!!!
  • confus3d
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.
  • rohitbaran
    This is my next card for certain. Fury X is a bit too expensive for my taste. With driver updates, I think the results will get better.
  • Larry Litmanen
    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.


    I was in Microcenter the other day, one of the very few places you can actually see a 4K display physically. I have to say I wasn't impressed; everything looked small, it just looks like they shrunk the images on PC.

    Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 for a new GPU.
  • Embra
    I hope you can add the 15.7 driver results.
  • Innocent_Bystander
    that's the one right there... Now if I could get off my *ss and actually upgrade my CPU from a Phenom II X4 965... :)
  • ern88
    181311 said:
    I hope you can add the 15.7 driver results.


    Here is your answer here:

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69792-amd-r9-fury-performance-review-20.html
  • cknobman
    This looks like the Fury to get, promising results IMO.
  • Vlad Rose
    Wow, no wonder Nvidia released the 980 Ti. I was a bit surprised to see these benchmarks with the Fury cards beating up the 980 like they are. Nvidia really took the wind out of AMD's sails with their Ti.
  • kcarbotte
    Quote:
    fyi - typos in verdict: should be "has proven" and "fewer texture units"

    Very late night writing. I'm impressed that's all there was actually.

    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.


    This is going to be relative to the game, and to the user's perception. Medium settings will have lower quality shadows and lighting, neither of which is improved by resolution alone.
    Some games will look great, and most will likely play excellently, but do you really want to spend $550+ on a graphics card to play your games at medium?

    4K can make up the difference of not using anti-aliasing, but it can't improve low quality visual features beyond making them look sharper.

    Quote:
    I was in Microcenter the other day, one of the very few places you can actually see a 4K display physically. I have to say I wasn't impressed; everything looked small, it just looks like they shrunk the images on PC. Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 for a new GPU.


    Sounds like you only got to see the monitor in a Windows environment, and not from a gaming perspective. Windows 8.1 doesn't scale 4K displays very well, making everything much smaller, sometimes too small to work with. Windows 7 is even worse.

    Windows 10 is supposed to have better scaling for 4K displays, but I haven't personally had a chance to verify this so take that as you will.

    I'd like to point out that before LCD screens were popular, 19- and 20-inch monitors used to have ridiculously high resolutions that made everything tiny. This was the norm back then, and even sought after. It's only since the emergence of 4K screens that we've gone full circle in that regard. People are just used to larger icons and text now.


    Quote:
    I hope you can add the 15.7 driver results.


    Aside from the spot testing we mentioned, we did not get a chance to fully benchmark the card with that driver before the samples we both had were sent back. My sample was gone before the driver was even released. Igor only had a few hours with it at the time.

    Future reviews will of course run the newer driver, so we'll have those tests for you as soon as we can.
  • Bloob
    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.

    Roughly: ultra settings define the quality of the content, while resolution defines your view of it. 4K just makes the picture sharper; if you currently notice no difference between 1080p with AA and 1080p without AA, then you won't notice much of a difference with 4K either. Personally, I've always preferred a smaller resolution with better settings.
  • FritzEiv
    Quote:
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!

    Quote:
    fyi - typos in verdict: should be "has proven" and "fewer texture units"


    "Has proved" is actually correct, but just in case I'm checking again with our copy chief. Good catch on the other, and you made me find another (it should be "door" singular rather than "doors" plural).

    - Fritz Nelson, Editor-in-chief
  • sicom
    Slapping a standard tri-cooler onto this short board is effing dumb.
  • FritzEiv
    Quote:
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!


    I agree!!! Since this is part of a product widget in our system, it's actually a dev change rather than a pure design change. At some point, these went from being transparent to being opaque. We've asked for it to be changed back, and it is in the dev queue I am told. Still, comments like these help me coax these changes along.

    - Fritz Nelson, Editor-in-chief
  • TechyInAZ
    Looks great! I like that AMD has not made a reference design. NVidia, learn from AMD. :P
  • 10tacle
    1943658 said:
    I'd like to point out that before LCD screens were popular, 19- and 20-inch monitors used to have ridiculously high resolutions that made everything tiny. This was the norm back then, and even sought after. It's only since the emergence of 4K screens that we've gone full circle in that regard. People are just used to larger icons and text now.


    Thank you! I keep having to remind myself that in the late '90s and into the early '00s, a portion of your readers were in grade school and weren't tech savvy enough back then to appreciate the changes we've experienced over the past 15 years. Circa 2000, your typical household PC (all 50% of the households that actually had one back then) had a 15-17" CRT monitor with a resolution of 1024x768 or 1280x1024, respectively. The high-end monitors of the time were 19"-21" 1600x1200 CRTs. It was a huge jump going from 1280x1024 to 1600x1200 at the time, as minor a jump in resolution as that sounds today.
  • confus3d
    675037 said:
    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.
    Roughly: ultra settings define the quality of the content, while resolution defines your view of it. 4K just makes the picture sharper; if you currently notice no difference between 1080p with AA and 1080p without AA, then you won't notice much of a difference with 4K either. Personally, I've always preferred a smaller resolution with better settings.


    I agree, as this has been my experience, but I haven't yet had the opportunity to try gaming on a 4K setup. My personal preference would be 1080p ultra with high frame rates rather than 4K medium at 30fps.
  • kcarbotte
    Quote:
    Slapping on a standard tri-cooler onto this short board is effing dumb.


    Considering the temperatures being reached with even this large of a cooler, I completely disagree with you there.
    With a chip that generates this much heat you have two options: loud fans that move a ton of air, or a giant heatsink and quiet fans.

    I think most people have a big enough case that the size doesn't make much difference, and would rather have quiet operation.

    There will likely be options from other vendors that go the opposite direction.
  • Gurg
    This was a pretty lame, bogus review. I did a side-by-side window comparison with the factory overclocked reviews of all cards from the June 18 AMD R9 390X article, and the results didn't come close to matching up. This was evidently done at very low resolutions, and it appears the non-Fury cards were not run at their factory overclocked settings while the Fury was overclocked.

    You did a great review of the MSI 390X vs. the MSI 980x factory overclocked at ultra game settings in June, but really dropped the ball here. Except for major changes to your standard computer setup (i.e. a new motherboard, CPU and RAM on an annual basis), driver updates and the inclusion of occasional new games, we should be able to compare previous testing results for cards on your standard bench system to the newest reviews, realizing that only drivers may be different.

    When you drastically change your testing methodology, it really brings your motivations into question. Personally, the June review nailed it with ultra game settings and factory overclocked settings for all cards, telling me what relative performance to expect if I bring a high-end card home and run it on my system at my monitor's resolution. You don't have to be an overclocking guru to move game settings to ultra and click the factory overclock setting on high-end cards at home and get similar results.