Radeon HD 6970 And 6950 Review: Is Cayman A Gator Or A Crock?

Power Consumption And Noise

Both of AMD’s Radeon HD 6900-series cards turn in power profiles that look a lot like those of existing boards in our three-loop logged run of Metro 2033 at 2560x1600 using Very High quality settings, 4x MSAA, and anisotropic filtering. The Radeon HD 6970 tracks very closely with the GeForce GTX 570, while the Radeon HD 6950 traces a line very near the Radeon HD 5870’s.

A closer look at the averages confirms this. The Radeon HD 6970’s average system power in this test is 321 W, while the GTX 570-based system consumes 329 W. The Radeon HD 6950 averages 279 W and the Radeon HD 5870 averages 274 W. 
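If you’re logging your own runs, getting from a raw power trace to averages like these is straightforward. Below is a minimal Python sketch, assuming a hypothetical CSV log with a "watts" column; the file names and format are illustrative only, not the output of our actual logging setup.

```python
import csv

def average_power(path, column="watts"):
    """Average the logged system power draw, in watts, over a benchmark run."""
    with open(path, newline="") as f:
        readings = [float(row[column]) for row in csv.DictReader(f)]
    return sum(readings) / len(readings)

# Hypothetical traces from a three-loop Metro 2033 run
hd6970 = average_power("metro2033_hd6970.csv")  # ~321 W in our testing
gtx570 = average_power("metro2033_gtx570.csv")  # ~329 W in our testing
print(f"Difference: {gtx570 - hd6970:.1f} W")
```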

When the Radeon HD 6900s are idle, they throttle down to a 250 MHz core clock and a 150 MHz memory clock to save power.

Nvidia stepped up its game with regard to the thermal and acoustic management of its high-end cards. After the poor reception of the GeForce GTX 480, both the GeForce GTX 580 and 570 managed to top the performance charts while coming in at the bottom of our noise benchmark.

AMD’s Radeon HD 6970 and 6950 also show well in this regard. Though the GeForce GTX 570 remains the quietest card in our comparison, it’s followed closely by the Radeon HD 6950. A couple of other configurations slide in ahead of the 6970, but it, too, is a very subtle presence in any performance PC.

The one caveat I’ll throw down here, and it really applies to any gaming machine, is that if you’re planning a CrossFire setup, you want these cards spaced at least three expansion slots apart, leaving enough room for airflow between the first and second boards. Cram them back-to-back and expect much less desirable acoustics: we measured a pair of Radeon HD 6970s at 53.9 dB(A), compared to 49.6 dB(A) for two GeForce GTX 570s, in that not-recommended configuration. And while I’ve been talking about dedicating four slots’ worth of expansion to dual-card setups up until now, it’s more realistic to think of any CrossFire or SLI array as needing to populate five slots.
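Because decibels are logarithmic, that 4.3 dB(A) gap between the back-to-back CrossFire and SLI pairs is larger than it looks. The quick Python sketch below applies the standard conversions (10 dB equals ten times the acoustic power and, as a rough psychoacoustic rule of thumb, about twice the perceived loudness) to the figures measured above.

```python
def db_ratios(delta_db):
    """Convert an A-weighted SPL difference into approximate ratios."""
    power_ratio = 10 ** (delta_db / 10)    # acoustic power/intensity ratio
    loudness_ratio = 2 ** (delta_db / 10)  # rule of thumb: +10 dB sounds about twice as loud
    return power_ratio, loudness_ratio

# Two Radeon HD 6970s vs. two GeForce GTX 570s, back-to-back
power, loudness = db_ratios(53.9 - 49.6)
print(f"~{power:.1f}x the acoustic power, ~{loudness:.2f}x the perceived loudness")
# -> roughly 2.7x the acoustic power, ~1.35x as loud
```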

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • terror112
    WOW not impressed.
  • Annisman
    Thanks for the review, Angelini. These new naming schemes are hurting my head; sometimes the only way to tell (at a quick glance) which AMD card matches up to which Nvidia card is by comparing the prices, which I think is bad for the average consumer.
  • rohitbaran
    These cards are to GTX 500 series what 4000 series was to GTX 200. Not the fastest at their time but offer killer performance and feature set for the price. I too expected 6900 to be close to GTX 580, but it didn't turn out that way. Still, it is the card I have waited for to upgrade. Right in my budget.
  • tacoslave
    imagine when this hits 32nm?
  • notty22
    AMD's top card is about a draw with the GTX 570.
    Pricing is in line.
    Gives AMD-only holdouts buying options Nvidia already offered.
    Merry Christmas
  • microterf
    Why drop the 580 when it comes to the multi-gpu scaling??
  • IzzyCraft
    Sorry, all I read was this:
    "This helps catch AMD up to Nvidia. However, Intel has something waiting in the wings that’ll take both graphics companies by surprise. In a couple of weeks, we'll be able to tell you more." And now I'm fixated on whether or not Intel's GPUs can actually commit to proper playback.
  • andrewcutter
    But from what I read at HardOCP, though it is priced alongside the 570, the 6970 was benched against the 580 and they were trading blows... So Tom's has it on par with the 570, but HardOCP has it on par with the 580. Now I'm confused, because if it can give 580 performance, or almost 580 performance, at 570 price and power, then this one is a winner. Similarly, a 6950 was trading blows with the 570 there. So I am very confused.
  • sgt bombulous
    This is hilarious... How long ago was it that there were ATI fanboys blabbering "The 6970 is gonna be 80% faster than the GTX 580!!!". And then reality hit...
  • manitoublack
    I'd have to say wait until the Christmas/New Year's dust settles.