
Bringing It All Together: The Tahiti GPU And Radeon HD 7970

AMD Radeon HD 7970: Promising Performance, Paper-Launched
By Igor Wallossek

The Tahiti GPU in AMD’s Radeon HD 7970 plays host to 32 CUs. With 64 ALUs per CU, that adds up to 2048 ALUs in total. Do the math at a 925 MHz core clock and you have a GPU capable of about 3.8 TFLOPS of 32-bit math and 947 GFLOPS of double-precision math. The L1 cache offers about 2 TB/s of bandwidth at this card’s clock rate, backed by a larger 768 KB L2 cache.
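As a sanity check, the peak-throughput arithmetic can be reproduced in a few lines (the helper function below is our own illustration, not AMD's formula; it assumes each ALU retires one fused multiply-add per clock, and that Tahiti runs double precision at one quarter the single-precision rate):

```python
def theoretical_flops(alus, clock_ghz, flops_per_alu_per_clock=2):
    """Peak throughput in GFLOPS: each ALU retires one FMA (2 FLOPs) per clock."""
    return alus * clock_ghz * flops_per_alu_per_clock

sp = theoretical_flops(2048, 0.925)  # single precision
dp = sp / 4                          # FP64 at 1/4 the FP32 rate on Tahiti
print(f"{sp / 1000:.1f} TFLOPS FP32, {dp:.0f} GFLOPS FP64")
# → 3.8 TFLOPS FP32, 947 GFLOPS FP64
```

Both numbers land exactly on the figures AMD quotes, which suggests the official specs are computed the same way.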

There are eight render back-ends capable of 32 full-color raster operations per clock, the same as the Radeon HD 6970. But while the raw specifications are identical, efficiency is improved by virtue of six 64-bit memory controllers yielding a wider 384-bit memory interface. Between the expanded bus and faster 1375 MHz GDDR5 memory, the Radeon HD 7970 boasts an impressive 264 GB/s of memory bandwidth, roughly 100 GB/s more than the Radeon HD 6970.
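The bandwidth figure follows directly from the bus width and memory clock, since GDDR5 transfers four bits per pin per clock (a quick sketch; the function name is ours):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s. GDDR5 is quad-pumped: 4 transfers/clock."""
    transfer_rate_gtps = mem_clock_mhz * 4 / 1000  # GT/s per pin
    return transfer_rate_gtps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(1375, 384))  # → 264.0 (Radeon HD 7970)
print(gddr5_bandwidth_gbs(1375, 256))  # → 176.0 (Radeon HD 6970's 256-bit bus)
```

The comparison makes the point: at the same memory clock, the extra two 64-bit controllers alone buy the 7970 its ~90 GB/s advantage.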

Revamped Tessellation Engines

Each GPU has two revamped geometry engines optimized for tessellation. Though they’re still limited to processing 2 billion vertices per second, AMD claims a 1.7x to 4x performance increase, depending on the number of subdivisions applied to the source primitive. The parameter buffer cache has also been enlarged.

PowerTune and ZeroCore

PowerTune should be familiar from our Radeon HD 6900-series launch coverage. To recap, the feature monitors work performed by the GPU and adjusts frequencies so that the board only uses the power that its maximum TDP allows. According to AMD, without PowerTune, the Radeon HD 7970’s core would have to be cut to about 720 MHz in order to fit within the 250 W envelope, taking worst-case scenarios into account.
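The clamping behavior AMD describes can be sketched as a simple control loop. This is purely illustrative, built on our own assumptions; AMD has not published PowerTune's actual algorithm, and only the 250 W TDP, 925 MHz stock clock, and 720 MHz worst-case figure come from the article:

```python
# Illustrative PowerTune-style sketch: pick the highest clock whose estimated
# power fits inside the TDP cap. Assumes power scales roughly linearly with
# clock -- a simplification; the real hardware uses activity counters.
TDP_W = 250
CLOCK_STEPS_MHZ = [925, 900, 850, 800, 750, 720]  # hypothetical step table

def pick_clock(estimated_power_at_925mhz_w):
    """Return the highest clock whose scaled power estimate fits the TDP."""
    for clock in CLOCK_STEPS_MHZ:
        if estimated_power_at_925mhz_w * clock / 925 <= TDP_W:
            return clock
    return CLOCK_STEPS_MHZ[-1]  # floor: clamp to the lowest step

print(pick_clock(200))  # typical game load: stays at 925
print(pick_clock(320))  # worst-case "power virus": throttled to ~720
```

The payoff of this scheme is that the stock clock can be set for typical workloads rather than pathological ones, which is exactly why AMD says a PowerTune-less 7970 would have had to ship at roughly 720 MHz.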

The Tahiti GPU does have new power management functionality up its silicon-encrusted sleeve, though, and it’s called ZeroCore technology. Comprising several features, including a deep sleep mode to reduce GPU power consumption, a DRAM stutter mode to reduce memory power, and the ability to compress the frame buffer’s contents, ZeroCore has a measurable effect on power draw during idle and monitor-off situations. AMD claims the card consumes just 15 W in a static Windows environment, and that its GPU turns off completely when the monitor is not in use. The fan even stops, and minimal heat is dissipated. Our power readings confirm that the Radeon HD 7970 uses notably less power than the Radeon HD 6970 at idle.

CrossFire users—the folks who often have to contend with the biggest thermal issues—will welcome another component of ZeroCore that turns off the second, third, or fourth board in a multi-GPU environment when they’re not needed. With supply of available Radeon HD 7970s painfully low and only a single sample available to test, we cannot yet confirm that this feature works as AMD is advertising. However, it’s on our list of things to double-check when the company’s supply stabilizes.

PCI Express 3.0

Data moves into and out of the GPU through PCI Express, of course, and AMD’s Radeon HD 7970 is the first graphics card to boast compatibility with the third-gen standard. Frankly, today’s desktop software doesn’t come close to saturating PCI Express 2.0 slots, even when they’re halved into eight-lane links. So, we doubt we’ll see any performance increase from the interface, currently supported only by Intel’s Core i7-3000-series processors. However, AMD hints that the 16 GB/s of bidirectional bandwidth may help compute applications in some cases. Again, though, vendors aren’t even ready to show off the applications they demonstrated at the Radeon HD 7970 briefing, and we weren’t given enough time to test the effects of third-gen PCI Express ahead of today’s embargo anyway.
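For reference, the quoted bandwidth figure follows from the PCI Express 3.0 signaling rate and its more efficient 128b/130b encoding (a quick sketch; the function name is ours):

```python
def pcie3_bandwidth_gbs(lanes):
    """PCIe 3.0 bandwidth per direction in GB/s:
    8 GT/s per lane, 128b/130b encoding, 8 bits per byte."""
    return 8 * (128 / 130) * lanes / 8

per_direction = pcie3_bandwidth_gbs(16)
print(f"{per_direction:.2f} GB/s per direction")  # ~15.75, quoted as 16 GB/s
```

A Gen2 x16 link manages about half that (5 GT/s with 8b/10b encoding, ~8 GB/s per direction), which is why even halved eight-lane Gen2 slots rarely bottleneck today's games.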

Meet Radeon HD 7970

At 10.5” long by 4.5” tall, the Radeon HD 7970’s PCB is exactly the same size as the 6970’s. It appears smaller, though, thanks to a design trick used by automakers and Apple: the heat sink is tapered at the end.

Despite the dimensional similarity, there are some notable differences between AMD’s successive single-GPU flagships. The back of the new card isn’t covered by a metal reinforcement plate, for starters. Moreover, its axial fan intake is just under three inches, while the Radeon HD 6970’s is about two and a half.

Speaking of that fan, it has larger, wider blades designed for better airflow at lower rotational speeds. Despite the seeming improvement, our experience with fan noise was not a positive one; check our noise benchmarks for more. AMD uses a newer version of the phase-changing thermal interface material (essentially a type of thermal paste) used on the Radeon HD 6990 to mate the cooler to the GPU. The two-step, three-level vapor chamber purportedly has an easier job pushing air out of the back of the card because AMD removed the stacked DVI connector, freeing up exhaust area.

So, with the second DVI connector removed, what’s left? The reference card comes with two mini-DisplayPort outputs, an HDMI output, and one dual-link DVI output. Before you get too torn up about a triple-monitor Eyefinity setup requiring an expensive investment in DVI adapters, there’s some good news: AMD says it's going to bundle an HDMI-to-DVI adapter and active mini-DisplayPort-to-DVI adapter with its cards. So, despite the loss of one previously-valuable connector, triple-monitor users should be better-supported now than they were with the Radeon HD 6970. Of course, that's not to say the company's add-in board partners will be as generous. Do your homework before picking a brand and make sure those extras come bundled before jumping on the lowest price.

With the cooler removed, you can see the large plate protecting the GPU, designed to resist the warping seen in previous-generation products. Here are some shots of the naked card and exposed GPU, posing in unabashed glory.

Note the six- and eight-pin power connectors, similar to the Radeon HD 6970. These should come as no surprise considering the similar TDP of both cards, although AMD claims the power delivery is overkill for everyone except overclockers. Also pay attention to the dual BIOS switch, which is another welcome carry-over from the 6900 series that facilitates firmware tweaking with a little less risk. 

Comments
  • 30 Hide
    thepieguy , December 22, 2011 3:32 AM
    If Santa is real, there will be one of these under my Christmas tree in a few more days.
  • 12 Hide
    mi1ez , December 22, 2011 3:40 AM
    Damn, that's a good looking GPU!
  • 11 Hide
    cangelini , December 22, 2011 3:40 AM
    a4mula: From a gaming standpoint I fail to see where this card finds a home. For 1920x1080 pretty much any card will work, meanwhile at Eyefinity resolutions it's obvious that a single gpu still isn't viable. Perhaps this will be something that people would consider over 2x 6950, but that isn't exactly an ideal setup either. While much of the article was over my head from a technical standpoint, I hope the 7 series addresses microstuttering in crossfire. If so than perhaps 2x 7950 (Assuming a 449$) becomes a viable alternative to 3x 6950 2GB. I was really hoping we'd see the 7970 in at 449, with the 7950 in at 349. Right now I'm failing to see the value in this card.


    I'll be trolling Newegg for the next couple weeks on the off-chance they pop up before the 9th. A couple in CrossFire could be pretty phenomenal, but it remains to be seen if they maintain the 6900-series scalability.
  • 7 Hide
    cangelini , December 22, 2011 3:41 AM
    thepieguy: If Santa is real, there will be one of these under my Christmas tree in a few more days.


    Hate to break it to you, but there won't be, unless you celebrate Christmas in mid-January.

    Start treating your SO super-nice and ask for one for Valentine's Day!
  • 16 Hide
    Darkerson , December 22, 2011 3:48 AM
    Well I know what I want at tax time :D 
  • 42 Hide
    danraies , December 22, 2011 3:49 AM
    cangelini: Start treating your SO super-nice and ask for one for Valentine's Day!


    If I ever find someone that will buy me a $500 graphics card for Valentine's Day I'll be proposing on the spot.
  • -3 Hide
    Zombeeslayer143 , December 22, 2011 3:56 AM
    No hard feelings to the author...thanks for the review nonethless..
  • 25 Hide
    cangelini , December 22, 2011 4:01 AM
    ZombeeSlayer143: WOW!!! I love the conslusion; all of it, which basically is interpretted as "I'm biased towards Nvidia," and trys [desperately] to say don't buy this card! Has the nerve to mention Kepler as an alternative; right, Kepler, as in 1 year away. The GTX580 just got "Radeon-ed" in it's rear. I'm not biased towards either manufacturer, just love to see and give credit to a team of people with passion, vision, and hardwork come together and put their company back on the map, as is shown here today with AMD's launch of the 7970. It's AMD's version of "Tebow Time!!"


    Not at all, actually. It offers great gaming performance, and for everything else...we have to wait, because it's not fully baked yet. They say fools rush in. This thing launched today. We were given less than a week with it. It won't be available to even *buy* for another three weeks. Are you telling me you're ready to crown a paper winner, and you're not biased? Please...
  • 10 Hide
    Zombeeslayer143 , December 22, 2011 4:04 AM
    cangelini: Hate to break it to you, but there won't be, unless you celebrate Christmas in mid-January. Start treating your SO super-nice and ask for one for Valentine's Day!


    Dude!!! I'm will be glued to NewEgg via my PDA and PC...I'm sooooo ooohhhh getting two of these!! If they sell for $549, I still think they provide the best value based on current high-end single GPU solutions (e.g., GTX 580, 6970, etc). Nvidia may want to consider a 20%-30% price reduction on the 580's once the 7970's are out.
  • -8 Hide
    Zombeeslayer143 , December 22, 2011 4:15 AM
    ZombeeSlayer143: Dude!!! I'm will be glued to NewEgg via my PDA and PC...I'm sooooo ooohhhh getting two of these!! If they sell for $549, I still think they provide the best value based on current high-end single GPU solutions (e.g., GTX 580, 6970, etc). Nvidia may want to consider a 20%-30% price reduction on the 580's once the 7970's are out.


    You have a valid argument there. However, it's not like we need to wait for Windows 8 to see what this GPU is made of, as was the case with Bulldozer. What you see is what we will get give or take 5%?????? Heck, I don't even care for the overclocking potential. If the performance was capped at what is shown here today, I'm certain everyone will be happy. But, of course thats not the case, which brings me back to your point; it's too early, don't get excited, it's just on paper right? LOL...sure buddy...I'm so glad NewEgg was sold out of the GTX580 Classified's the past few weeks; would have regretted that purchase.
  • 6 Hide
    masterofevil22 , December 22, 2011 4:15 AM
    Ahhh....I'll be on the EGG myself and on E of Bay with my 6950 (unlocked) the day this beast this arrives!!!!!!! :D 

    WOW...Go ATI!!!!
  • 15 Hide
    jdwii , December 22, 2011 4:18 AM
    Well Amd you did it, Wow it's been a while since i was able to say that, I'm so impressed with these results everything you guys used to fail on you fixed at your GPU division. I'm not going to lie i did not think Amd would pay it off but they did. All i can say is Nice and well done. You guys beat Nvidia with a 28nnm gpu. I just wonder what's going on at nvidia and why their behind on the 28nm die, And their is rumors that Nvidia wont get their new card out until 2Q of 2012!!


    To bad these cards wont be out this year they would make a lot of money. Oh well parents usually buy them self stuff after christmas i guess.
  • 23 Hide
    aznshinobi , December 22, 2011 4:23 AM
    OMG DAT OVERCLOCKING!
    I can't wait to get the 7850 or 7870 and OC the crap out of it!
  • 10 Hide
    Zero_ , December 22, 2011 4:28 AM
    Overclocking like a baws! It DOES beat a GTX590!
  • 13 Hide
    bavman , December 22, 2011 4:42 AM
    Damn, I didn't see this coming from AMD. This card is a beast...it takes on multi-gpu cards...price is a little high, but compared to the performance it looks like an amazing value.
  • 19 Hide
    hardcore_gamer , December 22, 2011 4:48 AM
    I'm getting one of these for my <30nm build (28 nm GPU and 22nm CPU). I'm callin it the "MOORE'S AR$E"