CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?

Is AMD Self-Loathing?

For years, we heard that ATI's graphics cards were more platform-dependent than Nvidia's and, depending on which company had the fastest processor at the time, should really be paired with that CPU. So, when AMD's highest-end processors started falling further and further behind Intel's quickest models, we weren't surprised when Nvidia started introducing AMD-compatible chipsets. Intel even forged a similar partnership with ATI, and we looked forward to the RD600 platform overshadowing Intel's own 975X as the premier enthusiast chipset for Conroe-based processors.

Many of us were confused when AMD decided to buy ATI rather than solidify its ties to Nvidia. Intel abandoned ATI's RD600 altogether and went off to develop X38 Express. Nvidia eventually dropped out of the PC chipset business entirely. But enthusiasts still took comfort in the notion that AMD’s acquisition might carry it through the rough times ahead. ATI was, after all, slightly more competitive.

Now that AMD and ATI are integrated (as well as two large companies can be after several years), we'd expect the company's CPU and GPU technologies to be extensively optimized for each other. Nevertheless, suggestions persist that Radeon cards need more processing power behind them to achieve their performance potential. If that's true, the implication is that whenever one of our Intel-based platforms shows a Radeon and a GeForce card performing similarly, an AMD-based system would actually show the GeForce performing better. Wait. What?

We began our tests with an evaluation of clock rate and its effect on CrossFire in FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire. Intel started out at a lower frequency and consequently had the most to gain. AMD couldn’t go very far beyond its stock clock rate without more exotic cooling, so it had the least to gain.

At the end of the day, both of our CPUs ended up at comparable clock rates with similarly little effort, making that article a great head-to-head match. But that slight speed-up from AMD meant that a second, GeForce-based article with the same CPU settings wouldn't have given us very much new information. So, I decided to jump straight to the point: Does AMD's flagship FX processor, overclocked, favor Nvidia graphics?

Thomas Soderstrom
Thomas Soderstrom is a Senior Staff Editor at Tom's Hardware US. He tests and reviews cases, cooling, memory and motherboards.
  • Arfisy Perdana
    what a pity for amd processor. So terrible
  • Crashman
    BigMack70: "This article was a good laugh... I sincerely hope nobody is throwing $800+ worth of graphics muscle onto an FX series CPU. I think AMD is just trolling gamers with their CPUs. Although they're definitely catching up to Intel while Intel just sorta sits there and doesn't do anything after declaring ultimate victory with Sandy Bridge."
    The great thing about AMD is that its chipsets have a lot of PCIe lanes. That should make them great for multi-way graphics. The problem is, the more cards you add, the worse the CPU looks. I can see someone doing 3-way SLI on a new 990FX board if they already had a few older/slower cards laying around.
    Arfisy Perdana: "what a pity for amd processor. So terrible"
    Not terrible: AMD charges appropriately less for its slower CPU. It's no big deal, unless you're trying to push some high-end AMD graphics cards.
    Nobody remembers that at the time AMD bought ATI, the two companies already had a business partnership with Nvidia on the co-marketing of 650A chipsets (AMD Business Platform). Also at the time AMD bought ATI, ATI already had a business partnership with Intel to develop the RD600 as a replacement for the 975X. AMD's purchase left both Nvidia and Intel stranded, as it took Intel more than a year to develop a replacement for the abandoned RD600.
  • Crashman
    BigMack70: "Meh... they're PCI-e 2.0 lanes so they need twice as many to equal the PCI-e 3.0 lanes on Z77"
    Depends on the cards you're using. 3-way at x8/x8/x4? Tom's Hardware did an article on how badly PCIe 2.0 x4 performed, so if you're carrying over a set of PCIe 2.0 cards from a previous system, well, I refer to the same comment that you referenced.
  • campdude
    Interesting article
  • Thanks for the article, it was great.
    AMD is actually doing fine with their products, especially with their GPUs.
    Why so much hate on their CPUs I will never understand. They are cheaper, aren't they?
  • esrever
    I am very confused by the article and the results.
  • CaptainTom
    I'm going to be honest, this article didn't prove anything for these reasons:

    -The i7 is stronger so of course it scaled better.
    -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power.
    -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel.
  • smeezekitty
    CaptainTom: "I'm going to be honest, this article didn't prove anything for these reasons: -The i7 is stronger so of course it scaled better. -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power. -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel."
    You hit the nail right on the head.
  • ohyouknow
    CaptainTom: "I'm going to be honest, this article didn't prove anything for these reasons: -The i7 is stronger so of course it scaled better. -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power. -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel."
    Truth. I didn't really see anything new here: the same games that show the FX falling behind elsewhere did the same thing in this test, Skyrim and so forth.
  • billcat479
    I wish I could have seen, let's say, a better test. I'm not great at knowing which games to pick, but one thing I would have liked to see is games that are able to use more than one core of the CPU.
    I know when AMD came out with the new FX, its single-threaded performance was still not up to Intel's standards, but in many of the tests that used more cores, the AMD could actually keep up with Intel's CPUs. Not as good all the time, but it's very, very easy to make these AMD CPUs look bad: just throw a single-threaded and/or older game at them and presto.
    I would have liked to see how the video card would play into this: if AMD were running more optimized software, would its CrossFire come out better, along with its overall effect in games that make use of it?
    I mean, how long have we had more than one CPU core now?
    And I know it's taken the software people a while to come up to speed, but the game board is changing and developers are starting more and more to use more than one core, so don't you think this would be important to check out as well? And just maybe see some new data on how AMD can use video cards when it's running software it was really designed for?
    We can play this Intel single-thread line till hell freezes over, and we all know there will not be any surprises as long as we do.
    And we have also seen a shift where low-cost game setups start to favor AMD's older CPUs because there is more software that can run on more cores. So let's start to even out the playing field a bit here, OK?