
AMD Radeon HD 6990 4 GB Review: Antilles Makes (Too Much) Noise


Several months late and supposedly only a couple of weeks ahead of Nvidia's own dual-GPU flagship launch, AMD's Radeon HD 6990 has no trouble establishing performance superiority. But does speed at any cost sacrifice too much of the user experience?

In drag racing, they say ‘a chase is a race.’ In other words, if you floor it and the guy next to you follows suit, that’s a race, and you’d better be prepared to pay up at the finish if it’s a money contest.

Both AMD and Nvidia have ridiculous dual-GPU hot rods they’ve been tweaking and tuning for months. Understandably, they want to stay secretive about their respective power plants. But neither one seems willing to mash the pedal and risk an embarrassing second-place finish. It’s a good thing that these two companies don’t live their lives a quarter-mile at a time. I can just see Vin, shaking his head in disappointment.

But come on already, guys! The AMD Radeon HD 6990 was supposed to be a 2010 model, and here we are in March wondering if AMD overpromised during its press briefing last October. We even heard rumors that the 6990 was canceled.

Au contraire, Pierre. It looks like AMD is making the first move with its blown Charger, daring Nvidia to throw down with a twin sequential turbocharged Supra...you probably know it as the rumored GeForce GTX 590. We received a single Radeon HD 6990 4 GB one week ago, beta drivers a couple of days later, and updated Catalyst Application Profiles a couple of days after that. Needless to say, the benchmarking marathon that went on in our Bakersfield, CA lab made the 24 Hours of Le Mans look like kart racing at an amusement park.

Meet Radeon HD 6990 4 GB

It just sounds majestic, doesn’t it? 6990. 4 GB. Unlike anything we’ve ever seen from AMD on the desktop. But don’t let naming trickery disarm you like the beautiful rosso corsa of Ferrari’s race cars.

The Radeon HD 6990 continues the pedigree of the Radeon HD 4870 X2 and Radeon HD 5970. It’s a dual-GPU card with graphics processors running, by default, at slightly reduced clock speeds compared to the company’s fastest single-chip board. Its 4 GB of memory is divided between the two ASICs, so you’re essentially looking at two 2 GB configurations on a single PCB, running in CrossFire.

Although it was previously referred to by the code name Antilles, Radeon HD 6990 centers on two of the Cayman-based GPUs found in Radeon HD 6970 and 6950 graphics cards. If you remember from Radeon HD 6970 And 6950 Review: Is Cayman A Gator Or A Crock?, Cayman employs a slightly modified architecture, designed to extract more performance per square millimeter of die space. There are situations where this VLIW4 architecture could underperform AMD's older VLIW5 design, but the company says those situations are rare.

Old VLIW5: Cypress

Bottom line: the highest-end Cayman configuration offers fewer ALUs than the most complex Cypress processor (found in the Radeon HD 5800-series cards). However, Cayman’s ALUs are more capable. For a deeper background on Cayman’s architecture, check the second page of our launch coverage.
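
For a rough sense of where those ALU counts come from (our own back-of-the-envelope arithmetic, not figures pulled from AMD’s slides), here’s a minimal Python sketch: Cypress packs five ALUs into each thread processor, while Cayman packs four into each thread processor across a larger number of SIMD engines.

```python
# Rough ALU accounting for the two architectures (variable names are ours).

# Cypress (Radeon HD 5870, VLIW5): 20 SIMD engines x 16 thread processors x 5 ALUs
cypress_alus = 20 * 16 * 5   # 1600

# Cayman (Radeon HD 6970, VLIW4): 24 SIMD engines x 16 thread processors x 4 ALUs
cayman_alus = 24 * 16 * 4    # 1536

print(cypress_alus, cayman_alus)  # 1600 1536 -- fewer ALUs, but each is more capable
```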

New VLIW4: Cayman

Each Cayman GPU serves up 1536 ALUs spread across 24 SIMDs. Each SIMD is tied to four texture units, for a total of 96 per GPU. Radeon HD 6990 utilizes Cayman in its uncut form, so you get 3072 ALUs and 192 texture units between the pair of GPUs. As mentioned, the 4 GB frame buffer is divided up into 2 GB of GDDR5 per processor, each connected via a 256-bit bus.

AMD unifies the two Cayman GPUs using the exact same 48-lane PCI Express 2.0 switch from PLX found on the Radeon HD 5970. Sixteen of those lanes serve the slot interface, 16 go to GPU 1, and 16 go to GPU 2.


| | Radeon HD 6990 | Radeon HD 6970 | Radeon HD 6950 | GeForce GTX 580 |
|---|---|---|---|---|
| Manufacturing Process | 40 nm TSMC | 40 nm TSMC | 40 nm TSMC | 40 nm TSMC |
| Die Size | 2 x 389 mm² | 389 mm² | 389 mm² | 520 mm² |
| Transistors | 2 x 2.64 billion | 2.64 billion | 2.64 billion | 3 billion |
| Engine Clock | 830 MHz | 880 MHz | 800 MHz | 772 MHz |
| Stream Processors / CUDA Cores | 3072 | 1536 | 1408 | 512 |
| Compute Performance | 5.1 TFLOPS | 2.7 TFLOPS | 2.25 TFLOPS | 1.58 TFLOPS |
| Texture Units | 192 | 96 | 88 | 64 |
| Texture Fillrate | 159.4 Gtex/s | 84.5 Gtex/s | 70.4 Gtex/s | 49.4 Gtex/s |
| ROPs | 64 | 32 | 32 | 48 |
| Pixel Fillrate | 53.1 Gpix/s | 28.2 Gpix/s | 25.6 Gpix/s | 37.1 Gpix/s |
| Frame Buffer | 4 GB GDDR5 | 2 GB GDDR5 | 2 GB GDDR5 | 1.5 GB GDDR5 |
| Memory Clock | 1250 MHz | 1375 MHz | 1250 MHz | 1002 MHz |
| Memory Bandwidth | 2 x 160 GB/s (256-bit) | 176 GB/s (256-bit) | 160 GB/s (256-bit) | 192 GB/s (384-bit) |
| Maximum Board Power | 375 W | 250 W | 200 W | 244 W |
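
If you like to sanity-check spec sheets, the fill rate and bandwidth figures above fall straight out of the unit counts and clocks. Here’s a minimal Python sketch (our own variable names; it assumes the conventional accounting of one texel per texture unit per clock, one pixel per ROP per clock, and GDDR5’s four data transfers per clock per pin):

```python
# Reproducing the Radeon HD 6990's headline fill rate and bandwidth numbers.

engine_clock_ghz = 0.830   # 830 MHz engine clock
mem_clock_mhz    = 1250    # 1250 MHz GDDR5 base clock
bus_width_bits   = 256     # per GPU
texture_units    = 2 * 96  # 96 per GPU, two GPUs
rops             = 2 * 32  # 32 per GPU, two GPUs

tex_fillrate = texture_units * engine_clock_ghz               # Gtex/s
pix_fillrate = rops * engine_clock_ghz                        # Gpix/s
bw_per_gpu   = mem_clock_mhz * 4 * bus_width_bits / 8 / 1000  # GB/s per GPU

print(f"{tex_fillrate:.1f} Gtex/s")   # ~159.4
print(f"{pix_fillrate:.1f} Gpix/s")   # ~53.1
print(f"{bw_per_gpu:.0f} GB/s per GPU (2 x 160 GB/s total)")
```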


Of course, we’re ecstatic that AMD is using fully-functional 40 nm Cayman GPUs—the kind you’d find on a Radeon HD 6970. But that product is already rated for up to 250 W maximum board power. Keeping the 6990’s thermal output manageable meant turning down the clocks from 880 MHz (Radeon HD 6970) to 830 MHz (Radeon HD 6990). AMD also uses a lower memory clock (1250 MHz rather than 1375 MHz). The resulting compute power adds up to 5.1 TFLOPS of single-precision math or 1.27 TFLOPS double-precision.
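
Those throughput figures check out against the clocks and ALU counts. Here’s a quick sketch (our own arithmetic, assuming the usual counting of two FLOPs per ALU per clock for a fused multiply-add, and Cayman’s double-precision rate of one quarter the single-precision rate):

```python
# Sanity check on the Radeon HD 6990's quoted compute throughput.

alus         = 2 * 1536   # two fully enabled Cayman GPUs
engine_clock = 0.830      # GHz

sp_tflops = alus * 2 * engine_clock / 1000  # multiply-add = 2 FLOPs per ALU per clock
dp_tflops = sp_tflops / 4                   # Cayman does double precision at 1/4 rate

print(f"{sp_tflops:.2f} TFLOPS single precision")  # ~5.10
print(f"{dp_tflops:.2f} TFLOPS double precision")  # ~1.27
```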

But AMD also arms this card with a couple of surprises that "break the rules" in the name of more muscle.

  • -2 Hide
    hayest , March 8, 2011 3:34 AM
    Killer Card!

    Out of spec for default seems kind of weird though.
  • -2 Hide
    CrazeEAdrian , March 8, 2011 3:37 AM
    Great job AMD. You need to expect noise and heat when dealing with a card that beasts out that kind of performance, it's part of the territory.
  • 7 Hide
    jprahman , March 8, 2011 3:40 AM
    This thing is a monster, 375W TDP, 4GB of VRAM! Some people don't even have 4GB of regular RAM in their systems, let alone on their video card.
  • 1 Hide
    one-shot , March 8, 2011 3:43 AM
    Did I miss the load power draw? I just noticed the idle and noise ratings. It would be informative to see the power draw of Crossfire 6990s and overclocked i7. I see the graph, but a chart with CPU only and GPU only followed by a combination of both would be nice to see.
  • 0 Hide
    anacandor , March 8, 2011 3:44 AM
For the people that actually buy this card, I'm sure they'll be able to afford an aftermarket cooler for this thing once they come out...
  • 0 Hide
    wino85 , March 8, 2011 3:46 AM
    OMG!!! It's finally here.
  • 0 Hide
    cangelini , March 8, 2011 3:48 AM
one-shot: Did I miss the load power draw? I just noticed the idle and noise ratings. It would be informative to see the power draw of Crossfire 6990s and overclocked i7. I see the graph, but a chart with CPU only and GPU only followed by a combination of both would be nice to see.


    We don't have two cards here to test, unfortunately. The logged load results for a single card are on the same page, though!
  • -1 Hide
    bombat1994 , March 8, 2011 3:52 AM
    things we need to see are this thing water cooled.

    and tested at 7680 x 1600

    that will see just how well it does.

    That thing is an absolute monster of a card.

    They really should have made it 32nm. then the power draw would have fallen below 300w and the thing would be cooler.

    STILL NICE WORK AMD
  • -1 Hide
    Bigmac80 , March 8, 2011 3:53 AM
Pretty fast. I wonder if this will be cheaper than 2 GTX 570's or 2 6950's?
    But omg this thing is freakin loud. What's the point of having a quiet system now with Noctua fans :( 
  • 2 Hide
    tacoslave , March 8, 2011 3:54 AM
It's hot, sucks a lot of power, and costs a ton. But I still want one.








    Badly
  • 7 Hide
    lashton , March 8, 2011 3:54 AM
AMD doesn't care about noise because they are waiting for custom cooling solutions from OEMs.
  • -1 Hide
    lashton , March 8, 2011 3:56 AM
bombat1994: things we need to see are this thing water cooled. and tested at 7680 x 1600. that will see just how well it does. That thing is an absolute monster of a card. They really should have made it 32nm. then the power draw would have fallen below 300w and the thing would be cooler. STILL NICE WORK AMD

That may be possible when they get 28nm ready on Bulldozer; they are just reaping the rewards of old tech.
  • 1 Hide
    scrumworks , March 8, 2011 4:01 AM
Starts with negative comments (noise), so no surprises from Chris. Fermi, of course, never made so much noise or consumed so much power that it would require this type of commenting. Everything was Power, PhysX and CUDA!
  • 0 Hide
    4745454b , March 8, 2011 4:03 AM
Meh. Too much for what it is. The only thing it does better than two 6970s is in power. (If one 6970 is 250W and this is 375W, then it uses less power than 2x6970.) But I agree that you're better off with a CF setup. Like the GTX480 and possibly the GTX580, it's simply too much for what you pay for.

Edit: I should say it's either too much or too little, and always in the wrong way for what you pay for. I also dislike the 375W TDP. We have specs/rules for a reason.
  • 0 Hide
    Haserath , March 8, 2011 4:04 AM
I think AMD will have the performance monster this round. It would be surprising if Nvidia was anticipating something like this. The GTX 570 already uses quite a few more watts than the 6970; what will they do to match two 6970s on one board?
    This is insanity!
  • 0 Hide
    megamanx00 , March 8, 2011 4:14 AM
    OMFG!!!!

    Well, I guess if you were thinking of running 3, or even 6 displays (which would require at least one hub or daisy chain monitor), this is the card you would want, perhaps even two of them. I'm guessing if you put two of them in you really really want it water cooled.
  • 0 Hide
    MasterMace , March 8, 2011 4:14 AM
Careful, you may not be able to hear a tornado siren a mile away with this one.
  • -1 Hide
    nforce4max , March 8, 2011 4:17 AM
    Well at least it is cheaper than the two 7900gtx duos (eom) that landed some years after introduction. Personally I wouldn't purchase this card knowing the driver bugs and the usual issues that dual gpu cards have except I'll wait a few years to snatch one up on the cheap as a collectors item. For those who got the money wait at least two or three weeks for reviews and complaints by owners of this card before you buy one. I can live with the noise but bad drivers I can't.
  • 4 Hide
    cangelini , March 8, 2011 4:45 AM
scrumworks: Starts with negative comments (noise), so no surprises from Chris. Fermi, of course, never made so much noise or consumed so much power that it would require this type of commenting. Everything was Power, PhysX and CUDA!


    LOL. Look at the power AND noise graphs, scrum :) 
  • 3 Hide
    dragonsqrrl , March 8, 2011 4:45 AM
    It looks like the improved CrossFireX scaling introduced with the HD6000 series really helps the HD6990 shine. There's no question about it, this thing's a top of the line performance beast.

    The big (and really inexcusable) problem is the noise, and to a lesser extent power consumption. It's by far the loudest single card stock cooler ever conceived, and that's taking into account the former champions, the GTX480 and HD5970. The load temps aren't great, but they're acceptable in my opinion. I'm not sure what people were expecting, but this is an extreme high-end dual GPU card, and load temps in the upper 80's C aren't uncommon in this performance segment. The problem is once again the excessive noise that's generated in order to keep the 2 GPU's running at those already high temps.

    I totally agree with the reviewer, the HD6990 seems rushed, the drivers are buggy, and if running a fan at 3k+ RPM is the only way to keep a card operating, it probably needs a little more tweaking before release.

lashton: That may be possible when they get 28nm ready on Bulldozer; they are just reaping the rewards of old tech.

Bulldozer will be manufactured using Global Foundries' 32nm process, not 28. The node shrink you're referring to for next-gen GPUs will be manufactured using a completely different 28nm process at TSMC. AMD uses Global Foundries only for its CPUs at this time.