CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?

Test Settings And Benchmarks

Test System Configuration
Intel CPU: Intel Core i7-3770K (Ivy Bridge), 3.5 GHz, 8 MB shared L3 cache, LGA 1155, overclocked to 4.4 GHz at 1.25 V
Intel Motherboard: Asus Sabertooth Z77, BIOS 1504 (08/03/2012)
Intel CPU Cooler: Thermalright MUX-120 w/ Zalman ZM-STG1 paste
AMD CPU: AMD FX-8350 (Vishera), 4.0 GHz, 8 MB shared L3 cache, Socket AM3+, overclocked to 4.4 GHz at 1.35 V
AMD Motherboard: Asus Sabertooth 990FX, BIOS 1604 (10/24/2012)
AMD CPU Cooler: Sunbeamtech Core-Contact Freezer w/ Zalman ZM-STG1 paste
RAM: G.Skill F3-17600CL9Q-16GBXLD (16 GB), DDR3-2200, CAS 9-11-9-36, 1.65 V
AMD Graphics: 2 x MSI R7970-2PMD3GD5/OC: 1,010 MHz GPU, GDDR5-5500
Nvidia Graphics: 2 x Gigabyte GV-N680OC-4GD: 1,137 MHz GPU, GDDR5-6008
Hard Drive: Mushkin Chronos Deluxe DX 240 GB, SATA 6Gb/s SSD
Sound: Integrated HD Audio
Network: Integrated Gigabit Networking
Power: Seasonic X760 SS-760KM: ATX12V v2.3, EPS12V, 80 PLUS Gold

Software
OS: Microsoft Windows 8 Professional RTM x64
AMD Graphics Driver: AMD Catalyst 12.10
Nvidia Graphics Driver: Nvidia GeForce 310.90

Great performance and quick installation keep Thermalright’s MUX-120 and Sunbeamtech’s Core-Contact Freezer in my inventory of favorite testing components. The brackets that come with these older samples make them non-interchangeable, however.


G.Skill’s F3-17600CL9Q-16GBXLD carries a remarkable DDR3-2200 CAS 9 rating and uses Intel XMP technology for semi-automatic configuration. Because the Sabertooth 990FX is not an Intel platform, it applies XMP values through Asus' DOCP setting.
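For context on what that rating means, first-word latency follows directly from the CAS figure and the data rate. A minimal sketch of the arithmetic (Python; the DDR3-1600 comparison kit is a hypothetical reference point, not part of the test system):

    # First-word latency of a DDR module: CAS cycles divided by the
    # I/O clock, which is half the DDR data rate.
    def first_word_latency_ns(data_rate_mts: float, cas_cycles: int) -> float:
        clock_mhz = data_rate_mts / 2            # DDR transfers twice per clock
        return cas_cycles / clock_mhz * 1000.0   # cycles / MHz -> nanoseconds

    # The G.Skill kit as rated above: DDR3-2200 at CAS 9
    print(first_word_latency_ns(2200, 9))  # ~8.2 ns
    # A hypothetical mainstream DDR3-1600 CAS 9 kit, for comparison
    print(first_word_latency_ns(1600, 9))  # ~11.3 ns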

Seasonic’s X760 provides the consistent efficiency required to assess platform power differences.
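That consistency matters because power is read at the wall, with the supply's conversion loss sitting between the meter and the platform. A minimal sketch of the correction (Python; the flat 90% efficiency is an assumption in the spirit of the unit's 80 PLUS Gold rating, not a measured curve):

    # Approximate DC platform load from an AC wall reading, assuming
    # a flat 90% conversion efficiency (80 PLUS Gold territory).
    EFFICIENCY = 0.90

    def dc_load_watts(ac_watts: float) -> float:
        return ac_watts * EFFICIENCY

    # A 20 W gap between two platforms at the wall...
    print(dc_load_watts(20.0))  # ...is roughly an 18 W gap in actual DC load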

Keeping the benchmark set from our previous round cut back on testing time, though it also meant using older drivers. The thing to remember is that we aren't trying to compare the performance of AMD's and Nvidia's graphics cards; we break each GPU vendor into separate charts precisely to prevent that comparison. Rather, we're interested in how each configuration behaves attached to the AMD- and Intel-based platforms.
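Concretely, the per-vendor number we're after is the scaling factor: how much a second card multiplies frame rate on each platform. A minimal sketch of the calculation (Python; the FPS values are placeholders for illustration, not results from this article):

    # Dual-card scaling factor: two-GPU FPS over one-GPU FPS, computed
    # per CPU platform so the comparison never crosses GPU vendors.
    def scaling_factor(single_fps: float, dual_fps: float) -> float:
        return dual_fps / single_fps

    # Placeholder numbers only:
    results = {
        "Core i7-3770K": (60.0, 112.0),  # (single-card FPS, dual-card FPS)
        "FX-8350":       (55.0, 92.0),
    }
    for platform, (single, dual) in results.items():
        print(f"{platform}: {scaling_factor(single, dual):.2f}x")  # 2.00x = perfect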

3D Game Benchmarks
Aliens vs. Predator: AvP Tool v1.03, SSAO/Tessellation/Shadows On. Test Set 1: High Textures, No AA, 4x AF. Test Set 2: Very High Textures, 4x AA, 16x AF
Battlefield 3: Campaign Mode, "Going Hunting," 90-second Fraps capture (averaged as sketched below). Test Set 1: Medium Quality Defaults (No AA, 4x AF). Test Set 2: Ultra Quality Defaults (4x AA, 16x AF)
F1 2012: Steam version, in-game benchmark. Test Set 1: High Quality Preset, No AA. Test Set 2: Ultra Quality Preset, 8x AA
Elder Scrolls V: Skyrim: Update 1.7, Celedon Aethirborn Level 6, 25-second Fraps capture. Test Set 1: DX11, High Details, No AA, 8x AF, FXAA Enabled. Test Set 2: DX11, Ultra Details, 8x AA, 16x AF, FXAA Enabled
Metro 2033: Full Game, Built-In Benchmark, "Frontline" Scene. Test Set 1: DX11, High, AAA, 4x AF, No PhysX, No DoF. Test Set 2: DX11, Very High, 4x AA, 16x AF, No PhysX, DoF On
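For the Fraps-captured runs above, average FPS is reduced from the frame-time log the tool records during the capture window. A minimal sketch of that reduction (Python; the file name and two-column layout are assumptions based on Fraps' standard frametimes CSV, which logs one millisecond timestamp per frame):

    import csv

    def average_fps(frametimes_csv: str) -> float:
        """Average FPS from a Fraps frametimes log (one ms timestamp per frame)."""
        with open(frametimes_csv, newline="") as f:
            reader = csv.reader(f)
            next(reader)                                # skip the header row
            stamps = [float(row[1]) for row in reader]  # per-frame time, ms
        elapsed_ms = stamps[-1] - stamps[0]
        return (len(stamps) - 1) / elapsed_ms * 1000.0  # frames per second

    # e.g., print(average_fps("bf3_going_hunting_frametimes.csv"))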
  • Arfisy Perdana
    What a pity for the AMD processor. So terrible.
    Reply
  • Crashman
    Quoting BigMack70: "This article was a good laugh... I sincerely hope nobody is throwing $800+ worth of graphics muscle onto an FX-series CPU. I think AMD is just trolling gamers with their CPUs. Although they're definitely catching up to Intel while Intel just sorta sits there and doesn't do anything after declaring ultimate victory with Sandy Bridge."
    The great thing about AMD is that its chipsets have a lot of PCIe lanes. That should make them great for multi-way graphics. The problem is, the more cards you add, the worse the CPU looks. I can see someone doing 3-way SLI on a new 990FX board if they already had a few older/slower cards lying around.
    Quoting Arfisy Perdana: "What a pity for the AMD processor. So terrible."
    Not terrible; AMD charges appropriately less for its slower CPU. It's no big deal, unless you're trying to push some high-end AMD graphics cards.
    Nobody remembers that, at the time AMD bought ATI, AMD already had a business partnership with Nvidia on the co-marketing of 650A chipsets (AMD Business Platform). Also at that time, ATI already had a business partnership with Intel to develop the RD600 as a replacement for the 975X. AMD's purchase left both Nvidia and Intel stranded; it took Intel more than a year to develop a replacement for the abandoned RD600.
    Reply
  • Crashman
    Quoting BigMack70: "Meh... they're PCI-e 2.0 lanes so they need twice as many to equal the PCI-e 3.0 lanes on Z77"
    Depends on the cards you're using. 3-way at x8/x8/x4? Tom's Hardware did an article on how bad PCIe 2.0 x4 performed, so if you're carrying over a set of PCIe 2.0 cards from a previous system, well, I refer to the same comment that you referenced.
    Reply
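    A quick check of the numbers in that exchange: per-lane bandwidth roughly doubled from PCIe 2.0 to 3.0, so 2.0 x16 and 3.0 x8 land within a couple percent of each other, while 2.0 x4 offers only a quarter of that. A minimal sketch of the arithmetic (Python; figures are each generation's published transfer rate and line-code overhead):

        # Per-direction PCIe link bandwidth: transfer rate x line-code
        # efficiency x lane count, converted from gigabits to gigabytes.
        GEN_SPECS = {                # (GT/s per lane, encoding efficiency)
            "2.0": (5.0, 8 / 10),    # 8b/10b line code
            "3.0": (8.0, 128 / 130), # 128b/130b line code
        }

        def link_bandwidth_gbs(gen: str, lanes: int) -> float:
            gts, efficiency = GEN_SPECS[gen]
            return gts * efficiency * lanes / 8

        print(link_bandwidth_gbs("2.0", 16))  # ~8.0 GB/s
        print(link_bandwidth_gbs("3.0", 8))   # ~7.9 GB/s, nearly identical
        print(link_bandwidth_gbs("2.0", 4))   # ~2.0 GB/s, the x4 bottleneck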
  • campdude
    Interesting article
    Reply
  • Thanks for the article, it was great.
    AMD is actually doing fine with their products, especially with their GPUs.
    Why there's so much hate for their CPUs I'll never understand. They're cheaper, aren't they?
    Reply
  • esrever
    I am very confused by the article and the results.
    Reply
  • CaptainTom
    I'm going to be honest: this article didn't prove anything, for these reasons:

    -The i7 is stronger, so of course it scaled better.
    -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power.
    -The overall differences were very small anyway, besides the obvious things like Skyrim preferring Intel.
    Reply
  • smeezekitty
    Quoting CaptainTom: "I'm going to be honest: this article didn't prove anything, for these reasons: the i7 is stronger, so of course it scaled better; the 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power; and the overall differences were very small anyway, besides the obvious things like Skyrim preferring Intel."
    You hit the nail right on the head.
    Reply
  • ohyouknow
    Quoting CaptainTom: "I'm going to be honest: this article didn't prove anything, for these reasons: the i7 is stronger, so of course it scaled better; the 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power; and the overall differences were very small anyway, besides the obvious things like Skyrim preferring Intel."
    Truth. I didn't really see anything new: the games that show the FX falling behind elsewhere, Skyrim and so forth, did the same thing in this test.
    Reply
  • billcat479
    I wish I could have seen a, let's say, better test. I'm not great at knowing which games to pick, but one thing I would have liked to see is games that are able to use more than one core of the CPU.
    I know that when AMD came out with the new FX, its single-threaded performance was still not up to Intel's standards, but in many of the tests that used more cores, the AMD could actually keep up with Intel's CPUs. Not as good all the time, but it's very, very easy to make these AMD CPUs look bad: just throw a single-threaded and/or older game at them and presto.
    I would have liked to see how the video card plays into this: if AMD were running more optimized software, would its CrossFire come out better, along with its overall results in games that make use of more cores?
    I mean, how long have we had more than one CPU core now?
    And I know it's taken the software people a while to come up to speed, but the game board is changing, and they are starting more and more to use more than one core, so don't you think this would be important to check out as well? And just maybe see some new data on how AMD handles video cards when it's running software it was really designed for?
    We can play this Intel single-thread line till hell freezes over, and we all know there will not be any surprises as long as we do.
    And we have also seen a shift where low-cost gaming setups start to favor AMD's older CPUs, because there is more software that can run on more cores. So let's start to even out the playing field a bit here, OK?
    Reply