
Benchmark Results: World Of Warcraft: Cataclysm (DX11)

The 990FX Chipset Arrives: AMD And SLI Rise Again

Our relationship with Blizzard’s World of Warcraft goes back a ways. In December of last year, I took a first look at the Cataclysm expansion pack’s experimental DirectX 11 support in World Of Warcraft: Cataclysm—Tom’s Performance Guide. That support has since been integrated into patch 4.1 and exposed through the game's options menu. It's now official, and if you have a DX11-capable card, I recommend enabling it. If you need to ask why, check out the performance guide; the frame rate boosts can be quite surprising.
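For readers who prefer to flip the switch manually, the renderer can also be forced through the game's configuration file. The line below is a sketch based on how the experimental support was enabled before patch 4.1 went official; treat the exact cvar name and value as an assumption and check it against your own WTF\Config.wtf:

    SET gxApi "d3d11"

Deleting the line (or setting it back to "d3d9") should drop you back to the DirectX 9 renderer.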

In our initial testing, we discovered that AMD’s processors were definitely limiting performance in this game. The average frame rate of the Phenom II X6 at 3.7 GHz was 60 frames per second.

Today we see that, with proper SLI support in place, AMD’s platforms hit 75 frames per second or so. But it doesn’t matter whether you run at 1680x1050 or 2560x1600, or whether you use 1x multisampling or 8x; the frame rate simply doesn’t change. AMD’s processors are still the “problem,” for as much as 75 FPS can be considered problematic.

We’re really only concerned because Intel’s CPUs do so much better, exceeding 100 FPS at 1680x1050 and 1920x1080, and only dipping below that mark at 2560x1600 with 8x MSAA turned on.

Comments
  • -3 Hide
    Marco925 , May 30, 2011 1:19 PM
    But! Does it play metro 2033?


    It Does!!!!
  • 0 Hide
    Anonymous , May 30, 2011 1:23 PM
    Nice to see support for both video card producers, especially for Nvidia. Now you can do AMD+Nvidia, not only AMD+ATI (AMD).
  • 0 Hide
    nforce4max , May 30, 2011 1:24 PM
    So fast that it sucks your eyeballs into the back of your skull C:
  • 24 Hide
    stingstang , May 30, 2011 1:44 PM
    Tom's, what the hell is this? "At the end of the day, it's the graphics cards which are the bottleneck."
    Did you go about benchmarking graphics cards, or was this a motherboard/cpu comparison? I'm tired of hearing this excuse all the time. We know you have a pair of 6990s and 590s in your shop. Get rid of that stupid bottleneck and DO IT RIGHT!
  • 19 Hide
    saint19 , May 30, 2011 2:03 PM
    I'd keep in mind that this performance review was made with an AM3 CPU and not with the new generation.
  • 8 Hide
    geekapproved , May 30, 2011 2:19 PM
    What performance review? They didn't get to test anything. LOL
  • -4 Hide
    Yuka , May 30, 2011 2:30 PM
    Thanks for the review, but at lower resolutions we all know that the CPU differences will become clear. So you just proved that if a game is taxing on the GPUs, both solutions are equal, and when the graphics card ain't being taxed, CPU differences become apparent... OK, thanks for proving what we already know once more (not being sarcastic here).
  • 7 Hide
    Anonymous , May 30, 2011 2:35 PM
    Useless test without Bulldozers. We already know Phenoms suck against SB. Hopefully Zambezi will be able to compete with SB in games, otherwise I'm ditching them, after 6 years of AMD CPUs in my rig.
  • 1 Hide
    Sud099 , May 30, 2011 2:55 PM
    We all know that SB CPUs >> the Phenom II series. Then how can one compare the performance of two platforms while SB CPU performance is superior to the Phenom II series...
  • 6 Hide
    nukemaster , May 30, 2011 3:31 PM
    tommysch: Is there any other brand?

    S3? :p 
  • 5 Hide
    amk09 , May 30, 2011 3:43 PM
    stingstang: Tom's, what the hell is this? "At the end of the day, it's the graphics cards which are the bottleneck." Did you go about benchmarking graphics cards, or was this a motherboard/cpu comparison? I'm tired of hearing this excuse all the time. We know you have a pair of 6990s and 590s in your shop. Get rid of that stupid bottleneck and DO IT RIGHT!


    I'm quite satisfied with this review. Nobody in their right mind is going to have dual 6990s or 590s and use a Phenom II X4 or an i5-2400.

    Although the point you made is absolutely correct, it wouldn't be a very logical review.
  • 2 Hide
    Onus , May 30, 2011 3:57 PM
    Disappointing. I can clearly see the value of features like Virtu and SRT on the Z68 platform, but the 990FX doesn't offer anything comparable. Well, I've waited this long; another few weeks to see if Fusion makes a difference won't kill me...
    ...Except that July is the month I expect the parasites' efforts to destroy the value of the dollar will start coming to their fruition.
  • 3 Hide
    cangelini , May 30, 2011 4:18 PM
    stingstang: Tom's, what the hell is this? "At the end of the day, it's the graphics cards which are the bottleneck." Did you go about benchmarking graphics cards, or was this a motherboard/cpu comparison? I'm tired of hearing this excuse all the time. We know you have a pair of 6990s and 590s in your shop. Get rid of that stupid bottleneck and DO IT RIGHT!


    I don't consider that doing it right. Nobody in their right mind buys a $180 AMD CPU and then pairs it with two $700 graphics cards. GTX 570s are a realistic choice.
  • 1 Hide
    bit_user , May 30, 2011 5:38 PM
    Does it really use the same silicon as the 890FX? In that case, what took them so long?!? The whole point of backwards compatibility is lost if you launch on the eve of a new CPU release.

    I was actually just about to buy an 890FX board + Phenom II X4 last year, with plans to upgrade to Bulldozer in late 2011. But then came the announcements of incompatibilities and the he-said/she-said rumors of possible compatibility, and I just decided to play it safe and wait.

    Well, AMD lost my business on this one. They could have at least sold me a Phenom X4. While I've been waiting, I've even been looking at the Sandy Bridge Xeons, which also support ECC and are more competitively priced than previous generations.

    Nice going, guys.
  • 3 Hide
    wh3resmycar , May 30, 2011 5:45 PM
    I beg to differ. How can two GTX 570s be a bottleneck in a real-world situation? It's not like the world would suddenly turn upside down if they used a 580 instead.

    And no, that's not what everybody wants, at least with this 990FX "preview".

    Hopefully Chris will do a follow-up on this article once the 'dozers come out.
  • -8 Hide
    Anonymous , May 30, 2011 6:23 PM
    This is wrong! It should be about the 990FX, not i5/980. I wanted to know how the 990FXA does against the 890FXA, not i5 vs. Phenom II; that we already know. And why did you include only gaming benchmarks? Were you afraid that Intel might look bad in multimedia benchmarks? Intel a**-kissers!