AMD Radeon R9 Fury Review: Sapphire Tri-X Overclocked

Product 360

Sapphire’s Radeon R9 Fury Tri-X makes use of a custom heat sink design featuring seven copper pipes of varying thickness and two separate sections of fins to dissipate thermal energy generated by the GPU and HBM. The central pipe is a gargantuan 10mm thick; two 8mm pipes flank it on each side, and these all span from the main section of fins over the GPU to a separate group of rear fins. The remaining two pipes are 6mm thick, each making a single loop back through the fins over the GPU contact block.

The card's PCB is quite short at only 12cm. However, the heat sink and shroud extend well beyond the back of the board. In fact, they nearly double its length, taking it to 23.5cm. The rear section of the cooler is wide open, with only a die-cast exoskeleton holding it in place. This facilitates significant airflow through the heat sink fins.

Sapphire said it was targeting a load temperature of less than 75 degrees C and modest acoustics as it designed the Tri-X cooler. The company uses three dual ball-bearing fans managed by advanced profiles, which ramp up the fans slowly in order to maintain silence when possible. Under normal load, the fans should spin at 40% or less, though they can be manually adjusted if more cooling capacity is desired. 
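
The exact fan curve isn't published. Purely as a rough illustration, the sketch below shows how a temperature-to-duty profile of this sort could be interpolated; the breakpoints are assumptions chosen for demonstration, not Sapphire's firmware values.

    # Illustrative sketch only: the breakpoints are assumed, not Sapphire's actual firmware values.
    def fan_duty(temp_c, curve=((40, 0), (60, 25), (75, 40), (85, 100))):
        """Linearly interpolate a fan duty cycle (%) from GPU temperature (C)."""
        if temp_c <= curve[0][0]:
            return curve[0][1]
        for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
            if temp_c <= t1:
                return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
        return curve[-1][1]

    print(fan_duty(74))  # -> 39.0, just under the 40% figure quoted above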

Not only is the R9 Fury Tri-X quite long, but it is also very thick. The card measures 5cm from the shroud to the screws protruding from its back plate, so clearance may be an issue on some motherboards. It came close to not fitting in our reference board's first PCIe slot; if the back plate were 1mm thicker, the card would not have fit.

Despite the card being technically capable of running four-way CrossFire, the heat sink blocks a neighboring PCIe slot. So, short of using riser cables, you’re limited to double-spaced setups unless you opt to replace the sink with a water block, which Sapphire actually cautions against.

According to Sapphire, the heat sink's design is rather intricate, and the company was adamant that we not remove it during our review. Apparently, because the HBM modules sit higher than the GPU, it is very difficult to reseat the cooler correctly, which can result in poor cooling performance and possibly damage the GPU.

Along the top of the card, you’ll find two eight-pin power connectors. A row of eight LEDs is positioned next to those auxiliary inputs, and they illuminate sequentially as the load increases.
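
Sapphire doesn't document the thresholds behind that indicator. As a hypothetical sketch only, a linear mapping from board power draw to the number of lit LEDs might look like the following; the full-scale value and the linear scaling are assumptions, not the card's measured behavior.

    # Hypothetical mapping: full_scale_w and the linear scaling are assumptions.
    def lit_leds(power_w, full_scale_w=300, total_leds=8):
        """Return how many of the eight LEDs would light at a given power draw."""
        fraction = max(0.0, min(power_w / full_scale_w, 1.0))
        return round(fraction * total_leds)

    print(lit_leds(150))  # -> 4 of 8 LEDs at half of the assumed full-scale power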

Sapphire also includes a BIOS switch that toggles between two slightly different profiles. One is tailored to a 75 degree C load temperature target and limits the power going to the GPU to 300W. The second option allows for an 80 degree C threshold and a 350W power limit.
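
Expressed as data, the two profiles boil down to the following; the temperature targets and power limits are Sapphire's stated figures, while the simple clamping function is only an assumed stand-in for whatever the firmware actually does.

    # Limits taken from Sapphire's description; the enforcement logic is an assumed simplification.
    PROFILES = {
        "quiet":       {"temp_target_c": 75, "power_limit_w": 300},
        "performance": {"temp_target_c": 80, "power_limit_w": 350},
    }

    def clamp_power(requested_w, profile="quiet"):
        """Clamp a requested board power to the active profile's limit."""
        return min(requested_w, PROFILES[profile]["power_limit_w"])

    print(clamp_power(340, "quiet"))        # -> 300
    print(clamp_power(340, "performance"))  # -> 340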

AMD’s Fury X was designed without a DVI port, and the Fury follows its lead. The I/O plate has three DisplayPort 1.2 connectors and one HDMI 1.4 interface. Up to four displays can be driven natively, but six are supported through MST hubs.

DVI is still supported through a DisplayPort-to-DVI adapter, which Sapphire graciously includes in its retail package. The company also adds an HDMI cable.

  • Comments
  • Troezar
    Some good news for AMD. A bonus for Nvidia users too, more competition equals better prices for us all.
    28
  • AndrewJacksonZA
    Kevin, Igor, thank you for the review. Now the question people might want to ask themselves is, is the $80-$100 extra for the Fury X worth it? :-)
    5
  • bjaminnyc
    My next card, nice job AMD.
    7
  • vertexx
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!
    33
  • eza
    fyi - typos in verdict: should be "has proven" and "fewer texture units"
    4
  • ern88
    I would like to get this card. I am currently playing at 1080p, but will probably go to 1440p soon!!!!
    0
  • confus3d
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.
    2
  • rohitbaran
    This is my next card for certain. Fury X is a bit too expensive for my taste. With driver updates, I think the results will get better.
    2
  • Larry Litmanen
    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.


    I was in Microcenter the other day, one of the very few places you can actually see a 4K display physically. I have to say I wasn't impressed; everything looked small, it just looks like they shrunk the images on the PC.

    Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 for a new GPU.
    2
  • Embra
    I hope you can add the 15.7 driver results.
    0
  • Innocent_Bystander
    that's the one right there... Now if I could get off my *ss and actually upgrade my CPU from a Phenom II X4 965... :)
    5
  • ern88
    Anonymous said:
    I hope you can add the 15.7 driver results.


    Here is your answer:

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69792-amd-r9-fury-performance-review-20.html
    4
  • cknobman
    This looks like the Fury to get, promising results IMO.
    3
  • Vlad Rose
    Wow, no wonder Nvidia released the 980 Ti. I was a bit surprised to see these benchmarks with the Fury cards beating up the 980 like they are. Nvidia really took the wind out of AMD's sails with their Ti.
    5
  • kcarbotte
    Quote:
    fyi - typos in verdict: should be "has proven" and "fewer texture units"

    Very late night writing. I'm impressed that's all there was actually.

    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.


    This is going to be relative to the game, and to the user's perception. Medium settings will have lower-quality shadows and lighting, neither of which is improved by resolution alone.
    Some games will look great, and most will likely play excellently, but do you really want to spend $550+ on a graphics card to play your games at medium?

    4K can make up for not using anti-aliasing, but it can't improve low-quality visual features beyond making them look sharper.

    Quote:
    I was in Microcenter the other day, one of the very few places you can actually see a 4K display physically. I have to say I wasn't impressed; everything looked small, it just looks like they shrunk the images on the PC.

    Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 for a new GPU.


    Sounds like you only got to see the monitor in a Windows environment, and not from a gaming perspective. Windows 8.1 doesn't scale 4K displays very well, making everything much smaller, sometimes too small to work with. Windows 7 is even worse.

    Windows 10 is supposed to have better scaling for 4K displays, but I haven't personally had a chance to verify this so take that as you will.

    I'd like to point out that before LCD screens were popular, 19- and 20-inch monitors used to have ridiculously high resolutions that made everything tiny. This was the norm back then, and even sought after. It's only since the emergence of 4K screens that we've gone full circle in that regard. People are just used to larger icons and text now.


    Quote:
    I hope you can add the 15.7 driver results.


    Aside from the spot testing we mentioned, we did not get a chance to fully benchmark the card with that driver before the samples we both had were sent back. My sample was gone before the driver was even released. Igor only had a few hours with it at the time.

    Future reviews will of course run the newer driver, so we'll run those tests for you as soon as we can.
    5
  • Bloob
    Quote:
    Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.

    Roughly: ultra settings define the quality of the content, while resolution defines your view of it. 4K just makes the picture sharper; if you currently notice no difference between 1080p with AA and 1080p without AA, then you won't notice much of a difference with 4K either. Personally, I've always preferred a lower resolution with better settings.
    3
  • FritzEiv
    Quote:
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!

    Quote:
    fyi - typos in verdict: should be "has proven" and "fewer texture units"


    "Has proved" is actually correct, but just in case I'm checking again with our copy chief. Good catch on the other, and you made me find another (it should be "door" singular rather than "doors" plural).

    - Fritz Nelson, Editor-in-chief
    1
  • sicom
    Slapping a standard tri-cooler onto this short board is effing dumb.
    -11
  • FritzEiv
    Quote:
    When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!


    I agree!!! Since this is part of a product widget in our system, it's actually a dev change rather than a pure design change. At some point, these went from being transparent to being opaque. We've asked for it to be changed back, and it is in the dev queue I am told. Still, comments like these help me coax these changes along.

    - Fritz Nelson, Editor-in-chief
    14
  • TechyInAZ
    Looks great! I like that AMD has not made a reference design. NVidia, learn from AMD. :P
    2