CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?
1. Is AMD Self-Loathing?

For years, we heard that ATI's graphics cards were more platform-dependent than Nvidia's and, depending on who had the fastest processor at the time, should really be used with that CPU. So, when AMD's highest-end processors started falling further and further behind Intel's quickest models, we weren't surprised when Nvidia started introducing AMD-compatible chipsets. Intel even forged a similar partnership with ATI, and we looked forward to the RD600 platform overshadowing Intel's own 975X as the premier enthusiast chipset for Conroe-based processors.

Many of us were confused when AMD decided to buy ATI rather than solidify its ties to Nvidia. Intel abandoned ATI's RD600 altogether and went off to develop X38 Express. Nvidia eventually dropped out of the PC chipset business entirely. But enthusiasts still took comfort in the notion that AMD’s acquisition might carry it through the rough times ahead. ATI was, after all, slightly more competitive.

Now that AMD and ATI are integrated (as well as two large companies can be after several years), we'd expect its CPU and GPU technologies to be extensively optimized for each other. Nevertheless, the suggestions continue that Radeon cards need more processing power behind them to achieve their performance potential. If that's true, the implication is that whenever one of our Intel-based platforms shows a Radeon and GeForce card performing similarly, an AMD-based system would actually show the GeForce performing better. Wait. What?

We began our tests with an evaluation of clock rate and its effect on CrossFire in FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire. Intel started out at a lower frequency and consequently had the most to gain. AMD couldn’t go very far beyond its stock clock rate without more exotic cooling, so it had the least to gain.

At the end of the day, both of our CPUs ended up at comparable clock rates with similarly little effort, making that article a great head-to-head match. But that slight speed-up from AMD meant that a second GeForce-based article using the same CPU settings wouldn't have told us much that was new. So, I decided to jump straight to the point: Does AMD’s flagship FX processor, overclocked, favor Nvidia graphics?

2. Test Settings And Benchmarks
Test System Configuration
Intel CPU: Intel Core i7-3770K (Ivy Bridge): 3.5 GHz, 8 MB Shared L3 Cache, LGA 1155 (overclocked to 4.4 GHz at 1.25 V)
Intel Motherboard: Asus Sabertooth Z77, BIOS 1504 (08/03/2012)
Intel CPU Cooler: Thermalright MUX-120 w/ Zalman ZM-STG1 Paste
AMD CPU: AMD FX-8350 (Vishera): 4.0 GHz, 8 MB Shared L3 Cache, Socket AM3+ (overclocked to 4.4 GHz at 1.35 V)
AMD Motherboard: Asus Sabertooth 990FX, BIOS 1604 (10/24/2012)
AMD CPU Cooler: Sunbeamtech Core-Contact Freezer w/ Zalman ZM-STG1 Paste
RAM: G.Skill F3-17600CL9Q-16GBXLD (16 GB), DDR3-2200 CAS 9-11-9-36 at 1.65 V
AMD Graphics: 2 x MSI R7970-2PMD3GD5/OC: 1,010 MHz GPU, GDDR5-5500
Nvidia Graphics: 2 x Gigabyte GV-N680OC-4GD: 1,137 MHz GPU, GDDR5-6008
Hard Drive: Mushkin Chronos Deluxe DX 240 GB, SATA 6Gb/s SSD
Sound: Integrated HD Audio
Network: Integrated Gigabit Networking
Power: Seasonic X760 SS-760KM: ATX12V v2.3, EPS12V, 80 PLUS Gold
Software
OS: Microsoft Windows 8 Professional RTM x64
AMD Graphics Driver: AMD Catalyst 12.10
Nvidia Graphics Driver: Nvidia GeForce 310.90


Great performance and quick installation keep Thermalright’s MUX-120 and Sunbeamtech’s Core Contact Freezer in my inventory of favorite testing components. The brackets that come with these older samples make them non-interchangeable, however.

G.Skill’s F3-17600CL9Q-16GBXLD carries a remarkable DDR3-2200 CAS 9 rating and uses Intel XMP technology for semi-automatic configuration. Because the Sabertooth 990FX is not an Intel platform, it applies XMP values through Asus' DOCP setting.

Seasonic’s X760 provides the consistent efficiency required to assess platform power differences.

Keeping the benchmark set from our previous round cut back testing time, though it also meant utilizing older drivers. The thing to remember is that we aren't trying to compare the performance of AMD's and Nvidia's graphics cards, and we're breaking each GPU vendor into separate charts to prevent this. Rather, we're interested in how each configuration behaves attached to AMD- and Intel-based platforms.

3D Game Benchmarks
Aliens vs. Predator: Using AvP Tool v1.03, SSAO/Tessellation/Shadows On
Test Set 1: High Textures, No AA, 4x AF
Test Set 2: Very High Textures, 4x AA, 16x AF
Battlefield 3: Campaign Mode, "Going Hunting," 90-Second Fraps
Test Set 1: Medium Quality Defaults (No AA, 4x AF)
Test Set 2: Ultra Quality Defaults (4x AA, 16x AF)
F1 2012: Steam Version, In-Game Benchmark
Test Set 1: High Quality Preset, No AA
Test Set 2: Ultra Quality Preset, 8x AA
The Elder Scrolls V: Skyrim: Update 1.7, Celedon Aethirborn Level 6, 25-Second Fraps
Test Set 1: DX11, High Details, No AA, 8x AF, FXAA Enabled
Test Set 2: DX11, Ultra Details, 8x AA, 16x AF, FXAA Enabled
Metro 2033: Full Game, Built-In Benchmark, "Frontline" Scene
Test Set 1: DX11, High, AAA, 4x AF, No PhysX, No DoF
Test Set 2: DX11, Very High, 4x AA, 16x AF, No PhysX, DoF On
3. Results: Aliens Vs. Predator

Again, we're making every attempt not to create a Radeon or GeForce graphics card review; the focus of this article is more specific. In order to address murmurs about AMD's platform dependencies, I want to know how each vendor's hardware scales on AMD’s flagship CPU. Two charts represent two different graphics sources, and ideal scaling would show two charts that look identical, apart from their actual average frame rates.
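One way to formalize "identical apart from absolute frame rates" is to normalize each vendor's results against its own fastest score and compare the shapes of the two series. Here's a minimal sketch of that idea, using hypothetical frame rates rather than our measured data:

```python
# Hypothetical frame rates (not measured results), illustrating how two
# charts can match in shape while differing in absolute performance.
radeon_fps = [120.0, 95.0, 60.0]    # e.g., three test settings on one platform
geforce_fps = [100.0, 79.0, 50.0]

def normalize(fps):
    """Scale a series so its fastest result equals 1.0."""
    peak = max(fps)
    return [f / peak for f in fps]

# Near-identical normalized series indicate similar CPU-to-GPU scaling,
# even though the Radeon numbers are higher across the board.
print([round(x, 2) for x in normalize(radeon_fps)])   # [1.0, 0.79, 0.5]
print([round(x, 2) for x in normalize(geforce_fps)])  # [1.0, 0.79, 0.5]
```

When the normalized series line up like this, the cards are scaling the same way on the platform, regardless of which one posts bigger raw numbers.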

Our less-demanding Aliens vs. Predator results demonstrate the ideal CPU-to-GPU scaling scenario. Even though AMD's Radeon HD 7970s are faster, the charts are a near-perfect match, suggesting similar scaling.

We again see consistency between Radeon and GeForce results. AvP prefers the Radeon cards, but when we ignore absolute frame rates, the bars appear nearly identical. If this holds through all of our testing, we can fairly confidently "bust the myth" that Radeon cards require higher-performance platforms to reach their potential, along with its corollary that AMD's FX CPUs bottleneck Nvidia's graphics products less.

4. Results: Battlefield 3

The Radeon graphics cards encounter a weird performance penalty at 1920x1080, regardless of the CrossFire mode or processor used. The consistency we're looking for doesn't come from the first and second charts, but rather from comparing the top half to the bottom half of each data set. Doing so tells us that the FX-8350’s small performance deficit doesn't come from the graphics cards we're using, but rather the processor's performance.

Incidentally, comparisons to the results from our most recent System Builder Marathon reveal that newer drivers ameliorate the anomaly at 1920x1080.

5. Results: F1 2012

We’ve already seen that F1 2012 is bottlenecked by memory throughput, so we’re not surprised to find AMD’s less effective memory controller holding back average frame rates at the low to medium settings that emphasize platform (rather than GPU) performance.

The only consistency we find appears to be a bottleneck outside of the graphics system, so we looked closer at the differences between single- and multi-GPU configurations.

Adding a second card to the FX-8350 platform actually drops F1 2012’s performance at its High Quality preset, with CrossFire imparting a bigger penalty than SLI. This game's Ultra detail setting is necessary to push the graphics workload beyond CPU and memory bottlenecks.

The FX-8350 appears to be a big limitation to Radeon HD 7970 performance. What happens when we switch to GeForce?

Looking exclusively at our 5760x1080 results, a single Radeon HD 7970 appears far more powerful than one GeForce GTX 680 in this game. We also see that the same GeForce GTX 680 appears far less CPU-bound than a single Radeon HD 7970 when we match either card up to AMD's FX-8350 processor. Unintended (and not ideal) though it might be, AMD's CPUs might really be allowing Nvidia's graphics cards to come closer to their potential than its own.

The FX-8350 appears to completely bind up CrossFire performance, while allowing SLI a little more space to stretch its legs at higher resolutions. This is again evidence of a platform-oriented issue, at least in one game thus far.

6. Results: The Elder Scrolls V: Skyrim

The FX-8350 appears to hamper the frame rates of CrossFire and SLI configurations alike in Skyrim. More taxing detail levels have little effect on the shape of the chart, though the numbers assigned to those colored bars are a little lower.

7. Results: Metro 2033

CrossFire performance scaling is tighter than SLI scaling in Metro 2033. Although this applies to both Intel and AMD CPUs, only the FX-8350 is able to pull down a pair of Radeon HD 7970s to the frame rates a single Radeon HD 7970 achieves. That’s a shame, since the company’s Tahiti graphics processor appears more powerful than the GeForce GTX 680's GK104.

8. Frame Rate-Over-Time Analysis

Our first article included a frame rate-over-time analysis intended to identify problematic sequences in our testing. Regardless of whether you're using one card or multiple GPUs, dramatic slow-downs interrupt game play. Unfortunately, although Nvidia enabled Don (up in Canada) and Chris (down in Southern California) with its FCAT tools, I'm only able to use the Fraps-based testing I ran previously. We know from our comparisons in Challenging FPS: Testing SLI And CrossFire Using Video Capture that Fraps isn't able to accurately capture the dropped and runt frames that might plague one graphics solution but not the other. However, we are at least able to track when each combination of cards drops to levels we deem unplayable.

Our least-demanding Battlefield 3 settings should reveal CPU bottlenecks, and indeed we see that three out of four multi-GPU configurations turn in similar results. Only Intel's Core i7 is powerful enough to butt up against the game’s 200 FPS cap, and only when it’s paired with two GeForce GTX 680s in SLI.

Our most taxing settings should demonstrate GPU limits. Unfortunately, those limits appear fairly consistent for all four dual-GPU configurations. At least the test runs smoothly, staying well above 50 FPS throughout its duration.

An attempt to demonstrate CPU-bound conditions in Skyrim is somewhat successful, with noticeable separation between the Intel and AMD CPU results. Both processors favor SLI over CrossFire in this title.

And now for our most taxing detail settings in Skyrim, where graphics performance matters most. Two Radeon HD 7970s in CrossFire beat the GeForce GTX 680s in SLI on the Intel-based system, while the reverse holds true for AMD's FX-8350. Doh!

9. Results: 3DMark 11

We've seen 3DMark favor Nvidia's graphics cards and Intel's CPUs. So, we're adding these numbers for the folks who track results across multiple sites, but we're not using them in our overall performance evaluation.

3DMark's Extreme detail preset is constrained a little more by graphics performance, reducing Intel’s margin of victory.

10. Power And Efficiency

Despite the marketing behind ZeroCore, and the technology's demonstrated effectiveness in single- and multi-card configurations, Radeon HD 7970s cannot idle with three monitors attached. The host processor's power use isn't bad, though. This is one of those instances where putting the AMD and Nvidia cards into separate charts makes sense, since we're trying to compare CPU-to-GPU pairings, rather than CPUs or GPUs alone.

Regardless of whether you're running under an AMD or Intel processor, adding a second Radeon HD 7970 appears to impart far greater power consumption than a second GeForce GTX 680. Single-GPU load power is comparable between the competing graphics cards.

Those power draw differences are reflected in reduced efficiency. Comparing performance to power, GeForce efficiency appears to increase in SLI, while Radeon efficiency appears to drop in CrossFire. None of this gets us closer to figuring out whether AMD’s fastest CPUs allow Nvidia's graphics hardware to reach further than its own, however.
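The performance-per-watt comparison boils down to simple division. A quick sketch with hypothetical numbers (not our measurements) shows why adding a second card can raise frame rates while lowering efficiency:

```python
def efficiency(avg_fps, avg_watts):
    """Performance per watt: average frame rate divided by system power draw."""
    return avg_fps / avg_watts

# Hypothetical single- and dual-card averages for one platform.
single = efficiency(60.0, 300.0)   # 0.20 FPS per watt
dual = efficiency(100.0, 550.0)    # ~0.18 FPS per watt

# If the second card adds proportionally more power draw than performance,
# efficiency falls even as raw frame rates climb.
change = (dual / single - 1) * 100
print(f"{change:.1f}% efficiency change")  # -9.1% efficiency change
```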

11. CPU-To-GPU Performance Scaling

Much to the chagrin of my boss, I followed up a week of benchmarking with a few days of wading through data, trying to figure out the best way to present these results. We can, for example, consider the cards first.

We clearly see that the Core i7-3770K gets a bigger boost from two Radeon HD 7970s in CrossFire than AMD's FX-8350 does, but that could simply reflect the Vishera-based chip's lower CPU performance. Fair enough. The FX is a less expensive processor, so we're completely fine with it not performing as well.

SLI scales a little better than CrossFire on the Intel CPU, and the SLI-versus-CrossFire spread is even wider on the FX-8350-based configuration. We’re getting close to an answer.

Using the Core i7-3770K as a common factor, we find that AMD's Radeon HD 7970 slightly outperforms the GeForce GTX 680 in our test suite. Yet, slightly better scaling allows SLI to catch up to CrossFire.

Conversely, the FX-8350 appears to favor Nvidia's SLI technology over CrossFire. That's another data point supporting claims that AMD's cards are more limited on its own processors, and that Nvidia's graphics hardware extracts more performance from a top-end FX.

Intel takes an 11% lead over AMD when paired to a single Radeon HD 7970. That lead shrinks to only 9% when a single GeForce GTX 680 is used. Of course, this chart doesn’t make clear whether AMD is favoring the GeForce, or Intel is instead favoring the Radeon.

CrossFire boosts gaming performance on Intel's Core i7-3770K by 72%, but only manages to speed up frame rates on AMD's FX-8350 by 47%. The Vishera-based FX gets a far bigger kick from SLI.
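For reference, scaling figures like these come from comparing the dual-card average to the single-card average. A hedged sketch, again with hypothetical frame rates rather than our measured data:

```python
def scaling_gain(single_fps, dual_fps):
    """Percent speed-up from adding a second graphics card."""
    return (dual_fps / single_fps - 1) * 100

# Hypothetical averages: a single card at 50 FPS and a pair at 86 FPS
# work out to 72% multi-GPU scaling.
print(f"{scaling_gain(50.0, 86.0):.0f}% gain")  # 72% gain
```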

12. How Does FX Treat Your Graphics Card?

Does AMD’s flagship CPU really allow GeForce cards to run closer to their potential than Radeons? That conclusion is still a little difficult to reach, since the Radeon HD 7970 is faster than the GeForce GTX 680 in today's benchmark suite. We can still get there by comparing the results of a less-bottlenecked processor, however, and we can get even more information by comparing CrossFire to SLI. After spending way too much time poring over the data, the best illustration of CPU-to-GPU scaling occurred as I wrote this article’s introduction.

The above chart shows that, without significant CPU bottlenecks, two GeForce GTX 680s in SLI and a pair of Radeon HD 7970s in CrossFire offer similar performance. It also illustrates that this performance similarity comes not from having identically-performing cards, but from a slight scaling advantage favoring the slower GeForce GTX 680s when you put two of them together.

Our big revelation is that AMD’s FX-8350 performs 5% better in SLI than it does in CrossFire. When all else is made equal, AMD’s current flagship host processor really does favor Nvidia's graphics technology. Whoops.