PCI Express & CrossFire: Scaling Explored

AMD’s CrossFire technology can be an excellent way to turn a mundane gaming machine into an FPS-shredding powerhouse, but with multiple Intel-based platforms supporting the feature across several different PCI Express lane configurations, not everyone is equally convinced. Putting aside the fact that many games only benefit from CrossFire at high resolutions and quality settings, plenty of users are concerned about whether their motherboard can provide enough bandwidth to realize the technology’s full potential.

We hear questions like “Should I upgrade my motherboard first?” and “CrossFire upgrade or new build?” in our user forums, with a different set of answers each time. It’s time to put some data behind those responses.

Five Core 2 motherboards from ASUS, ready for testing

With Core 2 chipset development halted in light of Core i7 and recent worldwide financial events causing many buyers to re-evaluate their spending habits, now is the perfect time to analyze how CrossFire scales on various chipsets as a guide for those looking to enable the best possible performance at the right price.

Digging deep into our hardware stash, we found Core 2-compatible motherboards going all the way back to the venerable 975X chipset, along with every generation of Intel LGA775-based motherboard since.

Of course, we had to pick a starting point, so today’s article brings with it all the prior upgrades that owners of high-end systems up to two years old could reasonably be expected to have purchased, such as Intel’s fastest Core 2 Duo processor and four gigabytes of high-speed memory. Follow along as we detail each build and attempt to determine which motherboards are suitable for CrossFire upgrades and which are better retired to platform heaven.

  • badge
    Thanks for laying that information out.
  • sparky2010
    should've included 1920x1200 in the last page, as there are a lot of people out there with screens capable of that resolution.. but anyways, all in all a very good and informative article.. but i'm going to settle for a complete makeover when core i7 becomes more available!
  • V3NOM
    yer kinda interesting to see how things have changed with new mobos but it doesnt really have any practical value tbh.
  • Crashman
    V3NOM: yer kinda interesting to see how things have changed with new mobos but it doesnt really have any practical value tbh.


    It's all about answering the question "Will a second card do the job?"

    Lots of guys have midrange or better ATI graphics cards, and the question of "upgrade or replace" is constantly being asked.
  • outlw6669
    Thanks for finally getting this review out!
  • arkadi
    p45 looks grate, and the price is right.
  • arkadi
    btw x58 is out there, just a reminder.
  • outlw6669
    @ arkadi
    Yes the x58 is out.
    However, as it cannot be paired with a Core 2 CPU and runs DDR3 exclusively, you cannot directly compare the results.
    In general, I would assume crossfire on the x58 will scale similarly to the x38/48 as they both have the same PCIe configuration.
  • Crashman
    outlw6669: Thanks for finally getting this review out!


    It was planned for September but kept getting delayed due to tight deadlines on other articles. But when the economy finally went from a slow decline to a nosedive in November, we knew this article had to come out right away. More people are putting new systems on hold and looking for ways to keep their old ones up to current performance standards, and we care about upgraders just as much as system builders.
  • arkadi
    Yeah I know, the comment was in general...
  • dimaf1985
    great article. concise and informative at the same time. now if only there was one for amd chipsets...
  • stuppes
    toms... i have lost so much respect for this website...
  • marraco
    Good work!

    Although, I have an Athlon X2 system, and I'm probably gonna upgrade to an i7 920. It would have been better to include a cheap i7 as a reference.
  • Lurker87
    Excellent info. It'll be nice having this article to link to.
  • antiacid
    This article shows that even in the best conditions, x48 vs p45 is at most 5% difference. Price-wise, this confirms my observations that the lower priced P45 boards are much better performance/value than the x48 premium counterparts.
  • Roland00
    I understand it is more testing, and you already had several months of delays but it would have been nice to see 1920x1200 numbers. 24" monitors are now in the mainstream affordability range with prices ranging from $249 to $349
  • waffle911
    I might be missing something, but it kinda looks like a Phenom 9950 paired with the 790FX SB750 would be comparable to the X48. But really, what am I missing? I can't find a direct comparison anywhere.
  • waffle911
    Sorry: bit of an oversight on my part. CPU charts, of course. Though the AMD board uses the older SB600, the performance shouldn't be much different.
  • Crashman
    Roland00: I understand it is more testing, and you already had several months of delays but it would have been nice to see 1920x1200 numbers. 24" monitors are now in the mainstream affordability range with prices ranging from $249 to $349



    You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts.

    Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.
  • FlorinR
    I'm trying to figure out something after reading this article; maybe someone could help me understand? It seems that a SINGLE Radeon HD 4870 still has enough bandwidth in a PCI-E 1.1 slot, and the differences in performance compared to PCI-E 2.0 come from the chipset (P35 vs. P45 in SINGLE card configuration). Am I wrong?
  • Crashman
    It appears that the cards work fine with PCIe 1.1 at x16 width. When you reduce the width to x8, PCIe 1.1 doesn't appear to have enough bandwidth for some games. When you drop the width to x4, things get much worse.
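    As a rough sanity check on the widths Crashman mentions, the per-direction bandwidth of each slot configuration can be sketched in a few lines (the helper function is our own; it assumes the nominal 2.5/5.0 GT/s lane rates and 8b/10b encoding from the PCIe 1.x/2.0 specs):

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable PCIe bandwidth per direction, in GB/s.

    gen 1 (PCIe 1.0/1.1): 2.5 GT/s per lane; gen 2 (PCIe 2.0): 5.0 GT/s.
    Both generations use 8b/10b encoding, so 80% of the raw rate is data.
    """
    raw_gts = {1: 2.5, 2: 5.0}[gen]
    # bits per second -> bytes per second, scaled by lane count
    return round(raw_gts * 0.8 / 8 * lanes, 3)

print(pcie_bandwidth_gbs(1, 16))  # x16 gen 1: 4.0 GB/s
print(pcie_bandwidth_gbs(2, 8))   # x8 gen 2: also 4.0 GB/s
print(pcie_bandwidth_gbs(1, 4))   # x4 gen 1: 1.0 GB/s, the troublesome case
```

    This lines up with the observation above: an x16 PCIe 1.1 link offers the same raw bandwidth as an x8 PCIe 2.0 link, while x4 at gen 1 has only a quarter of it.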
  • theblade
    Nice article, I would like to see one using SLI and NVIDIA chipsets.
  • Roland00
    Quote:
    You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts. Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.

    I know it is far more work doing another resolution on top of the 3 you are already doing. You are doing 33% more testing, and that will translate into dozens more hours of work (and thus dozens more hours of delay on this article as well as others, for an article you were planning for September). I understand and sympathize.

    It is just my belief that 24" monitors are becoming more of a "sweet spot" in the market due to prices going down due to technology and overproduction in a time when demand isn't so hot (I mean it is crazy you can find good 19 inch monitors for $99 to $129 now. And Frys had a 22inch Samsung for $179 last week). It is now possible to get a 1200p or 1080p monitor for the mid $200s to $300s when last year you were talking $500 to $600 for the same monitor.

    And yes I can guess where 1920x1200 will end up with, but the problem is we are seeing geometric growth after 1680x1050
    124% more pixels going from 1024x768 to 1680x1050.
    132% more pixels going from 1680x1050 to 2560x1600
    30% more pixels going from 1680x1050 to 1920x1200

    Yet we see this performance increase of adding a second crossfire card
    3% High 1024x768 p45
    19% High 1680x1050 p45
    93% High 2560x1600 p45

    Jumping from 19% to 93% is a big jump, yet you don't get that jump from 1024 to 1680.

    ----

    If it is only a marginal 25 to 30% jump in speeds, it may be better just to save up for a next-generation card instead of buying a second one. The 3850-to-4850 jump was much bigger than 30%, as was the 8600 to 9600. (Though the 8800 GT to GTX 260 wasn't that big of a jump.)

    ----

    In full disclosure, I don't even game at 1920x1200; I game at 1920x1080 using an HDTV as my monitor. Yet people like my younger brother are considering building a computer, and the decision between a 4850 512MB and a 4850 1GB makes a difference: while they show only small differences in normal games, there is a good difference in CrossFire between the two, since the 1GB card has twice the memory and each card has to store its own copy of the data in a non-shared memory buffer.

    ----

    Thank you for the article though, I just had a small complaint but overall it was very helpful.
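    The pixel-count percentages in the comment above can be double-checked with a quick script (the helper name is ours):

```python
def extra_pixels_pct(res_from, res_to):
    """Percent increase in pixel count between two (width, height) modes."""
    w1, h1 = res_from
    w2, h2 = res_to
    return round((w2 * h2 / (w1 * h1) - 1) * 100)

print(extra_pixels_pct((1024, 768), (1680, 1050)))   # 124 (% more pixels)
print(extra_pixels_pct((1680, 1050), (2560, 1600)))  # 132
print(extra_pixels_pct((1680, 1050), (1920, 1200)))  # 31
```

    So 1920x1200 adds only about a third more pixels than 1680x1050, while 2560x1600 more than doubles the count, which is why the CrossFire gains jump so sharply at the top resolution.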
  • hannibal
    Crashman: You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts. Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.


    Yep, it's easy to see where the 1920x1200 results would be. Thank you very much for a very useful article!
    I am not so sure that dumping 1024x768 is a good idea. I myself have not used that resolution for years, but it's the standard VGA resolution, and many gamers with 4:3 screens use it when their system is not fast enough to run the game at higher settings.
    But because we are moving to widescreen 16:10 and now even 16:9, it will soon be time to move totally to widescreen resolutions. Then it's even useful to forget other 4:3 resolutions like 1600x1200 and smaller.
    The reason is that it's relatively easy to extrapolate 4:3 results from widescreen results.
    I think that maybe even next year you could use only 16:10 or 16:9 results and give guidelines for those who cannot work out which "old" resolution is nearest to each widescreen resolution tested. The total pixel count is what matters anyway!