AMD Radeon HD 7970 GHz Edition Review: Give Me Back That Crown!

Temperature And Noise

The Radeon HD 7970 GHz Edition is only slightly warmer and a skosh louder than its predecessor at idle. But we’re not concerned about idle. The last time we looked at the Radeon HD 7970, we were most worried about its bad behavior under load.

Although we’re not worried about an 84-degree load temperature, it is nine degrees higher than the retail Radeon HD 7970 we have in the lab, making this the hottest-running single-GPU card we have.

Much more troubling is the noise generated by AMD’s problematic reference cooler. I even had to go back and re-test because it seemed inconceivable that the company would ship out cards that were even louder than the Radeon HD 6990.

There’s a silver lining here, though. Ahead of this review, I let AMD know about our acoustic concerns, and the company claims that most partner boards will employ third-party cooling rather than its reference configuration. Earlier this week, in Radeon HD 7950 3 GB: Six Cards, Benchmarked And Reviewed, we saw that new heat sinks and fans can work wonders on Tahiti-based boards. Fingers crossed, then, that the Radeon HD 7970 GHz Edition cards that show up on store shelves don’t sound like our sample.

Of course, you don't have to wait for board partners to work their magic...

  • esrever
    50 MHz boosts are kinda low imo
  • Darkerson
    My only complaint with the "new" card is the price. Otherwise it looks like a nice card. Better than the original version, at any rate, not that the original was a bad card to begin with.
  • mayankleoboy1
    Thanks for putting my name in the review :D

    now if only you could bold it :lol:
  • wasabiman321
    Great, I just ordered a GTX 670 FTW... Grrr, I hope performance gets better with Nvidia drivers too :D
  • mayankleoboy1
    nice show, AMD!

    With WinZip that does not use the GPU, VCE that slows down video encoding, and a card that gives lower minimum FPS... EPIC FAIL.
    Next time, try to ensure software compatibility before releasing your products.
  • hellfire24
    not trying to be a fanboy but "Still GTX 670 gives you best BANG FOR DA BUCK!"
  • vmem
    jrharbort: To me, increasing the memory speed was a pointless move. Nvidia realized that all of the bandwidth provided by GDDR5 and a 384-bit bus is almost never utilized. The drop back to a 256-bit bus on their GTX 680 allowed them to cut cost and power usage without causing a drop in performance. High-end AMD cards see the most improvement from an increased core clock. Memory... not so much. Then again, Nvidia pretty much cheated on this generation as well. Cutting out nearly 80% of the GPGPU logic, something Nvidia had been trying to market for YEARS, allowed them to further drop production costs and power usage. AMD now has the lead in this market, but at the cost of higher power consumption and production cost. This quick fix by AMD will work for now, but they obviously need to rethink their future designs a bit.
    The issue is that them rethinking their future designs scares me... Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow. True, Nvidia is able to produce more gaming performance for less, but this is pushing anyone who wants GPU compute to get an overpriced professional card. Now, before you say "well, if you're making a living out of it, fork out the cash and go Quadro", let me remind you that a lot of innovators in various fields actually do use GPU compute to make progress (especially in the academic sciences) and ultimately bring us better tech AND new directions in tech development... and I for one know a lot of government-funded labs that can't afford to buy a stack of Quadro cards.
  • andrewcarr
    So happy :)
  • DataGrave
    Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit.
    100% acknowledged.

    And for the gamers: take a look at the new UT4 engine! Without excellent GPGPU performance, this will be a disaster for every graphics card. See you, Nvidia.
  • cangelini
    mayankleoboy1: Thanks for putting my name in the review... now if only you could bold it ;-)
    Excellent tip. Told you I'd look into it!