


Our relationship with Blizzard’s World of Warcraft goes back a ways. In December of last year, I took a first look at the Cataclysm expansion pack’s experimental DirectX 11 support in World Of Warcraft: Cataclysm—Tom’s Performance Guide. That support was recently integrated into patch 4.1, accessible under the game’s options menu. It’s now official, and if you have a DX11-capable card, I recommend enabling it. If you need to ask why, check out the performance guide; the frame rate gains can be quite surprising.
In our initial testing, we discovered that AMD’s processors were definitely limiting performance in this game. The average frame rate of the Phenom II X6 at 3.7 GHz was 60 frames per second.
Today we see that, with proper SLI support in place, AMD’s platforms hit 75 frames per second or so. It doesn’t matter whether you run at 1680x1050 or 2560x1600, or whether you use 1x or 8x multisampling; the frame rate simply doesn’t change. AMD’s processors are still the “problem,” for as much as 75 FPS can be considered problematic.
We’re really only concerned because Intel’s CPUs do so much better, exceeding 100 FPS at 1680x1050 and 1920x1080, and only dipping below that mark at 2560x1600 with 8x MSAA turned on.
- 990FX: Socket AM3+ Meets SLI
- 990FX Boards From Asus And MSI
- Hardware And Benchmark Setup
- Benchmark Results: 3DMark 11
- Benchmark Results: Metro 2033 (DX11)
- Benchmark Results: Lost Planet 2 (DX11)
- Benchmark Results: Aliens Vs. Predator (DX11)
- Benchmark Results: Battlefield: Bad Company 2 (DX11)
- Benchmark Results: F1 2010 (DX11)
- Benchmark Results: Just Cause 2 (DX11)
- Benchmark Results: World Of Warcraft: Cataclysm (DX11)
- Conclusion
Were you benchmarking graphics cards, or was this a motherboard/CPU comparison? I'm tired of hearing this excuse all the time. We know you have a pair of 6990s and 590s in your shop. Get rid of that stupid bottleneck and DO IT RIGHT!
It Does!!!!
The missing part said something like:
...here "face"), but you said you wanted to test AMD's SLI on their 990FX vs. Intel's SLI. So, IMO, you need less graphics horsepower: something like two GTS 250s, two GTX 460s, or two GTX 560s (not Tis) to tax the graphics subsystem and really show the differences. Maybe also raise the resolution to really show whether there is a difference between AMD's and Intel's SLI.
Thanks again for the Article, Mr Chris.
Cheers!
Is there any other brand?
S3?
I'm quite satisfied with this review. Nobody in their right mind is going to run dual 6990s or 590s with a Phenom II X4 or an i5-2400.
Although the point you made is absolutely correct, it wouldn't be a very logical review.
...Except that July is the month I expect the parasites' efforts to destroy the value of the dollar will start coming to their fruition.
I don't consider that doing it right. Nobody in their right mind buys a $180 AMD CPU and then pairs it with two $700 graphics cards. GTX 570s are a realistic choice.
I was actually just about to buy an 890FX board + Phenom II X4 last year, with plans to upgrade to Bulldozer in late 2011. But then came the announcements of incompatibilities and the he-said/she-said rumors of possible compatibility, and I decided to play it safe and wait.
Well, AMD lost my business on this one. They could have at least sold me a Phenom II X4. While I've been waiting, I've even been looking at the Sandy Bridge Xeons, which also support ECC and are more competitively priced than previous generations.
Nice going, guys.
And no, that's not what everybody wants, at least with this 990FX "preview."
Hopefully Chris will do a follow-up to this article once the 'dozers come out.