CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?

Is AMD Self-Loathing?

For years, we heard that ATI's graphics cards were more platform-dependent than Nvidia's and, depending on who had the fastest processor at the time, were best paired with that CPU. So, when AMD's highest-end processors started falling further and further behind Intel's quickest models, we weren't surprised when Nvidia started introducing AMD-compatible chipsets. Intel even forged a similar partnership with ATI, and we looked forward to the RD600 platform overshadowing Intel's own 975X as the premier enthusiast chipset for Conroe-based processors.

Many of us were confused when AMD decided to buy ATI rather than solidify its ties to Nvidia. Intel abandoned ATI's RD600 altogether and went off to develop X38 Express. Nvidia eventually dropped out of the PC chipset business entirely. But enthusiasts still took comfort in the notion that AMD's acquisition might carry it through the rough times ahead. ATI was, after all, slightly more competitive in graphics than AMD was in CPUs.

We began our tests with an evaluation of clock rate and its effect on CrossFire in FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire. Intel started out at a lower frequency and consequently had the most to gain. AMD couldn’t go very far beyond its stock clock rate without more exotic cooling, so it had the least to gain.

At the end of the day, both of our CPUs ended up at comparable clock rates with similarly little effort, making that article a great head-to-head match. But that slight speed-up from AMD meant that a second GeForce-based article with the same CPU settings wouldn't have given us much new information. So, I decided to jump straight to the point: Does AMD's flagship FX processor, overclocked, favor Nvidia graphics?

Thomas Soderstrom is a Senior Staff Editor at Tom's Hardware US. He tests and reviews cases, cooling, memory and motherboards.