AMD Radeon HD 7970 GHz Edition Review: Give Me Back That Crown!

Radeon HD 7970 GHz Edition Gets Our Aftermarket Cooling Treatment

We're not going to recommend against buying a card with AMD's reference cooler without at least trying to come up with an alternative first.

Considering that our sample hit 84 degrees C in an air-conditioned room during a relatively modest looping test that doesn't even apply a constant load, we don't want to know how it'd sound (or perform) during an extended gaming session.

The following chart shows the temperature curve of our reference card running a far more demanding FurMark workload at four different clock speeds over a period of four minutes.
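If you want to log a curve like this on your own hardware, here's a minimal sketch of the idea (our illustration, not the tooling used for the chart above). It assumes a Linux box whose graphics driver (amdgpu or radeon) exposes its temperature sensor through the kernel's standard hwmon sysfs interface; the file names below come from that interface, and the script itself is hypothetical. Run FurMark or another stress workload alongside it and plot the resulting CSV.

# Sketch: sample GPU temperature once per second for four minutes and write a CSV.
# Assumes a Linux system where the amdgpu/radeon driver registers an hwmon sensor.
import glob
import time

SAMPLE_SECONDS = 4 * 60   # four-minute window, matching the chart above
INTERVAL = 1.0            # one reading per second

def find_gpu_temp_file():
    """Return the first hwmon temp input whose driver name looks like an AMD GPU."""
    for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
        with open(name_file) as f:
            if f.read().strip() in ("amdgpu", "radeon"):
                return name_file.replace("name", "temp1_input")
    raise RuntimeError("No AMD GPU hwmon sensor found")

def log_curve(path, out_csv="gpu_temp_curve.csv"):
    with open(out_csv, "w") as out:
        out.write("seconds,celsius\n")
        start = time.time()
        while time.time() - start < SAMPLE_SECONDS:
            with open(path) as f:
                millidegrees = int(f.read().strip())  # hwmon reports millidegrees C
            out.write(f"{time.time() - start:.1f},{millidegrees / 1000:.1f}\n")
            time.sleep(INTERVAL)

if __name__ == "__main__":
    log_curve(find_gpu_temp_file())

Repeating the run at each of the four clock speeds and overlaying the four CSVs reproduces the kind of comparison shown in the chart.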

Regardless of how noisy AMD's solution is, can an aftermarket cooler ameliorate this thermal situation? We decided to go with a cooler in the $50 range, which still ends up pricey when you add it to the card's $500 cost. Spending anything more just to counter a problem with the reference design, we think, is wasteful.

Is Gelid's Icy Vision-A A Good Compromise Between Performance And Price?

Gelid’s Icy Vision isn’t a particularly new design, but it's now officially validated for the Radeon HD 7970. How well does it handle almost 250 W of heat dissipation under full load?

After we finished installing the new cooler, we re-ran our tests and compared the results to AMD’s reference design.

Gelid’s Icy Vision-A doesn't deliver the same cooling efficiency as what we've seen from companies like MSI, HIS, or Gigabyte, but the improvement is quantifiable both in the above chart and in our acoustic testing. In light of its moderate price, we believe this cooler is a reasonable choice that complements AMD's hardware well. This is the treatment we're hoping to see from board partners.

Of course, the bars on a chart don’t really tell you much about the actual noise a specific cooler produces. That’s where our videos come in, allowing our readers to compare both cooling solutions directly.

First, our Radeon HD 7970 GHz Edition with Gelid's Icy Vision-A:

AMD's Radeon HD 7970 GHz Edition with Gelid Icy Vision-A

Then, our Radeon HD 7970 GHz Edition with AMD's reference cooler:

AMD's Radeon HD 7970 GHz Edition with Reference Cooler

What a difference. The bottom line is that $50 absolutely gets you a better cooling solution. In our opinion, you're better off saving the $50 AMD is trying to make on its GHz Edition card. Instead, buy the original Radeon HD 7970 and spend the leftover money on a cooler like this. You'll achieve similar performance at a similar price, but with less noise and better thermals. The only sacrifice is a loss of warranty.

  • Comments
  • esrever
    50 MHz boosts are kinda low imo
    14
  • Darkerson
    My only complaint with the "new" card is the price. Otherwise it looks like a nice card. Better than the original version, at any rate, not that the original was a bad card to begin with.
    6
  • mayankleoboy1
    Thanks for putting my name in the review :D

    now if only you could bold it :lol:
    2
  • wasabiman321
    Great, I just ordered a GTX 670 FTW... Grrr, I hope performance gets better for Nvidia drivers too :D
    6
  • mayankleoboy1
    Nice show, AMD!

    With WinZip that doesn't use the GPU, VCE that slows down video encoding, and a card that gives lower minimum FPS... EPIC FAIL.
    Before releasing your products, try to ensure S/W compatibility.
    -22
  • hellfire24
    Not trying to be a fanboy, but the GTX 670 still gives you the best BANG FOR DA BUCK!
    16
  • vmem
    jrharbort: To me, increasing the memory speed was a pointless move. Nvidia realized that all of the bandwidth provided by GDDR5 and a 384-bit bus is almost never utilized. The drop back to a 256-bit bus on the GTX 680 allowed them to cut cost and power usage without causing a drop in performance. High-end AMD cards see the most improvement from an increased core clock. Memory... not so much. Then again, Nvidia pretty much cheated on this generation as well. Cutting out nearly 80% of the GPGPU logic, something Nvidia had been trying to market for YEARS, allowed them to further drop production costs and power usage. AMD now has the lead in this market, but at the cost of higher power consumption and production cost. This quick fix by AMD will work for now, but they obviously need to rethink their future designs a bit.

    The issue is, them rethinking their future designs scares me... Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit. True, Nvidia is able to produce more gaming performance for less, but this is pushing anyone who wants GPU compute toward an overpriced professional card. Now before you say "well, if you're making a living out of it, fork out the cash and go Quadro", let me remind you that a lot of innovators in various fields actually do use GPU compute to make progress (especially in the academic sciences) and ultimately bring us better tech AND new directions in tech development... and I for one know a lot of government-funded labs that can't afford to buy a stack of Quadro cards.
    26
  • andrewcarr
    So happy :)
    4
  • DataGrave
    Quote:
    Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit.
    100% acknowledged.

    And for the gamers: take a look at the new UT4 engine! Without excellent GPGPU performance, this will be a disaster for every graphics card. See you, Nvidia.
    26
  • cangelini
    mayankleoboy1: Thanks for putting my name in the review, now if only you could bold it

    ;-)
    Excellent tip. Told you I'd look into it!
    7
  • scrumworks
    When do you actually start using new games for benchmarks? Let me give you a hint: WoW is not a new game, and its performance isn't meaningful either, because it's badly coded and the graphics are at a lousy Quake 2 level. See for yourself what kind of game they are testing here: http://tinyurl.com/dxarebj
    -3
  • esrever
    Could you do power consumption with a game instead of 3DMark? It seems these cards use much less power during gaming, and 3DMark doesn't give a realistic showing. Maybe having both graphs would be nice, so you know the maximum as well as the general gaming power use.
    7
  • DataGrave
    Quote:
    Maybe having both graphs would be nice, so you know the maximum as well as the general gaming power use.
    Pages 2-4
    0
  • EzioAs
    AMD's driver team really deserves praise this time. Kudos, AMD!
    32
  • cangelini
    scrumworks: When do you actually start using new games for benchmarks? Let me give you a hint: WoW is not a new game, and its performance isn't meaningful either, because it's badly coded and the graphics are at a lousy Quake 2 level. See for yourself what kind of game they are testing here: http://tinyurl.com/dxarebj

    WoW is meaningful, actually.
    New games will make it in when vendors start giving us more than two or three days to retest all of their graphics cards :)
    10
  • vrumor
    recon, you are about as big of an Nvidia fanboy as the people disliking your comment are AMD fanboys. So before ya spout off about fanboyism, look in the mirror, bud. I have a 7970, and until I read this I would tell people all day to get a 670. So I guess it holds true: the truth hurts, huh?
    7
  • sarinaide
    Very nice work, basically raising clocks that were way lower than Nvidia's to restore parity in performance. Obviously the bumped-up speeds will increase load power, but the idle and low-state power draws are still industry leading. A very good update.
    2
  • masterjaw
    Either way, this will be good for the competition and for us consumers.

    And here I am, hoping that this kind of competitive landscape would also exist in the enthusiast CPU market.
    5
  • xtreme5
    what????????? AMD over the GTX 680 no no, it's not possible noooooooooooooooooooooooo....
    -9
  • sarinaide
    recon-uk: Meh at best.

    I really hope this is trolling, rather than a callous endeavor to discredit a competitor's product.
    8