AMD Radeon HD 7970 GHz Edition Review: Give Me Back That Crown!

Radeon HD 7970 Vs. Radeon HD 7970 GHz Edition

We start off by testing both cards at 1050 and 925 MHz. Since our samples are completely stable at those frequencies, we didn’t have to touch the PowerTune slider at all. The new card didn’t throttle, either, yielding an ideal comparison. As before, we logged power consumption for 50 seconds, using a gaming workload this time.
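If you want to capture a similar power log in software rather than with dedicated measurement hardware, the sketch below polls the GPU's reported power draw over a 50-second window. It assumes a Linux system whose driver exposes an amdgpu hwmon sensor (power1_average, in microwatts); that interface is a modern-driver assumption, not what we used for these measurements, and the card0 path may differ on your machine.

    # Rough software-side power log over a 50-second window (assumes Linux + amdgpu hwmon).
    import glob
    import time

    # Locate the GPU's average-power sensor; the driver reports it in microwatts.
    sensors = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
    if not sensors:
        raise SystemExit("No amdgpu power sensor found on this system")
    sensor = sensors[0]

    samples = []
    duration_s, interval_s = 50, 0.5  # same 50-second window as our gaming-load log

    start = time.time()
    while time.time() - start < duration_s:
        with open(sensor) as f:
            samples.append(int(f.read().strip()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval_s)

    print(f"{len(samples)} samples, average {sum(samples) / len(samples):.1f} W, peak {max(samples):.1f} W")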

The dotted lines represent one card running at the emulated clock speeds of the other. And the final analysis yields an interesting result: the older and supposedly less-refined card draws marginally less power at 1050 MHz. It does even better at 925 MHz, coming in almost 5 W under the GHz Edition board. Perhaps this is a result of AMD’s voltage-adding mechanism designed to keep Tahiti more stable at its boost frequency.

However, we’re still not applying a full load to either card. Our next test does exactly that, using a compute workload that doesn’t trigger throttling.

Power draw is pretty similar between the two boards. The new card might do its job under the TDP ceiling defined for the original 7970, but AMD’s GHz Edition board definitely doesn’t offer more performance at the same power levels as its predecessor. If you want more speed, you have to use more power.

  • esrever
50 MHz boosts are kinda low imo
  • Darkerson
    My only complaint with the "new" card is the price. Otherwise it looks like a nice card. Better than the original version, at any rate, not that the original was a bad card to begin with.
  • mayankleoboy1
    Thanks for putting my name in teh review :D

    now if only you could bold it :lol:
  • wasabiman321
    Great I just ordered a gtx 670 ftw... Grrr I hope performance gets better for nvidia drivers too :D
  • mayankleoboy1
Nice show, AMD!

With WinZip that does not use the GPU, VCE that slows down video encoding, and a card that gives lower min FPS... EPIC FAIL.
Or, before releasing your products, try to ensure S/W compatibility.
  • hellfire24
    not trying to be a fanboy but "Still GTX 670 gives you best BANG FOR DA BUCK!"
  • vmem
jrharbort: To me, increasing the memory speed was a pointless move. Nvidia realized that all of the bandwidth provided by GDDR5 and a 384-bit bus is almost never utilized. The drop back to a 256-bit bus on their GTX 680 allowed them to cut cost and power usage without causing a drop in performance. High-end AMD cards see the most improvement from an increased core clock. Memory... Not so much. Then again, Nvidia pretty much cheated on this generation as well. Cutting out nearly 80% of the GPGPU logic, something Nvidia had been trying to market for YEARS, allowed them to drop production costs and power usage even further. AMD now has the lead in this market, but at the cost of higher power consumption and production cost. This quick fix by AMD will work for now, but they obviously need to rethink their future designs a bit.
The issue is that them rethinking their future designs scares me... Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit. True, Nvidia is able to produce more gaming performance for less, but this is pushing anyone who wants GPU compute to get an overpriced professional card. Now before you say "well if you're making a living out of it, fork out the cash and go Quadro", let me remind you that a lot of innovators in various fields actually do use GPU compute to make progress (especially in the academic sciences) and ultimately bring us better tech AND new directions in tech development... and I for one know a lot of government-funded labs that can't afford to buy a stack of Quadro cards
  • andrewcarr
    So happy :)
  • DataGrave
Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit.
100% agreed.

And for the gamers: take a look at the new UT4 engine! Without excellent GPGPU performance, this will be a disaster for every graphics card. See you, Nvidia.
  • cangelini
mayankleoboy1: Thanks for putting my name in teh review now if only you could bold it ;-)
Excellent tip. Told you I'd look into it!