
Results: Metro 2033

CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?

We've heard it said before that AMD's GPUs are more platform-dependent than Nvidia's. So, what happens when you drop a Radeon and a GeForce into an FX-8350-based system? Does AMD's CPU get in the way of its GPU running as well as it possibly could?

CrossFire performance scaling is tighter than SLI scaling in Metro 2033. Although this applies to both Intel and AMD CPUs, only the FX-8350 is able to pull down a pair of Radeon HD 7970s to the frame rates a single Radeon HD 7970 achieves. That’s a shame, since the company’s Tahiti graphics processor appears more powerful than the GeForce GTX 680's GK104.
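The scaling comparison described above reduces to simple arithmetic. As a hypothetical sketch (the function name and frame rates are illustrative, not the article's measurements):

```python
def scaling_efficiency(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percentage of the ideal 2x speedup achieved by adding a second GPU."""
    return 100.0 * dual_gpu_fps / (2.0 * single_gpu_fps)

# Illustrative numbers only: a setup that goes from 60 FPS on one card
# to 100 FPS on two achieves about 83% of perfect scaling.
print(f"{scaling_efficiency(60.0, 100.0):.1f}%")  # prints 83.3%
```

A lower efficiency on one CPU than another, for the same pair of cards, is what the article means by a processor "pulling down" a multi-GPU configuration.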

Comments
  • -8 Hide
    Arfisy Perdana , April 10, 2013 9:28 PM
    what a pity for amd processor. So terrible
  • 13 Hide
    Crashman , April 10, 2013 9:29 PM
BigMack70: This article was a good laugh... I sincerely hope nobody is throwing $800+ worth of graphics muscle onto an FX series CPU. I think AMD is just trolling gamers with their CPUs. Although they're definitely catching up to Intel while Intel just sorta sits there and doesn't do anything after declaring ultimate victory with Sandy Bridge.
    The great thing about AMD is that its chipsets have a lot of PCIe lanes. That should make them great for multi-way graphics. The problem is, the more cards you add the worse the CPU looks. I can see someone doing 3-way SLI on a new 990FX board if they already had a few older/slower cards laying around.
Arfisy Perdana: what a pity for amd processor. So terrible
    Not terrible, AMD charges appropriately less for its slower CPU. It's no big deal, unless you're trying to push some high-end AMD graphics cards.
Nobody remembers that at the time AMD bought ATI, it already had a business partnership with Nvidia on the co-marketing of 650A chipsets (AMD Business Platform). Also at that time, ATI already had a business partnership with Intel to develop the RD600 as a replacement for the 975X. AMD's purchase left both Nvidia and Intel stranded, as it took Intel more than a year to develop a replacement for the abandoned RD600.
  • 9 Hide
    Crashman , April 10, 2013 9:36 PM
BigMack70: Meh... they're PCI-e 2.0 lanes so they need twice as many to equal the PCI-e 3.0 lanes on Z77

    Depends on the cards you're using. 3-way at x8/x8/x4? Tom's Hardware did an article on how bad PCIe 2.0 x4 performed, so if you're carrying over a set of PCIe 2.0 cards from a previous system, well, I refer to the same comment that you referenced.
  • 0 Hide
    campdude , April 10, 2013 10:11 PM
    Interesting article
  • 31 Hide
    kounelos , April 10, 2013 10:36 PM
    Thanks for the article it was great.
    Amd is actually doing fine with their products especially with their GPUs.
    Why so much hate on their CPUs i will never understand.They are cheaper aren't they?
  • -3 Hide
    esrever , April 10, 2013 10:36 PM
    I am very confused by the article and the results.
  • 18 Hide
    CaptainTom , April 10, 2013 10:49 PM
    I'm going to be honest, this article didn't prove anything for these reasons:

    -The i7 is stronger so of course it scaled better.
    -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power.
    -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel.
  • 9 Hide
    smeezekitty , April 10, 2013 10:59 PM
CaptainTom: I'm going to be honest, this article didn't prove anything for these reasons: -The i7 is stronger so of course it scaled better. -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power. -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel.

    You hit the nail right on the head.
  • -5 Hide
    ohyouknow , April 10, 2013 11:11 PM
CaptainTom: I'm going to be honest, this article didn't prove anything for these reasons: -The i7 is stronger so of course it scaled better. -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power. -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel.


Truth. I didn't really see anything here other than the same games that already show the FX falling behind doing the same thing in this test, Skyrim and so forth.
  • -2 Hide
    billcat479 , April 10, 2013 11:44 PM
I wish I could have seen a, let's say, better test. I'm not great at knowing which games to pick, but one thing I would have liked to see is games that are able to use more than one CPU core.
I know when AMD came out with the new FX, its single-threaded performance was still not up to Intel's standards, but in many of the tests that used more cores the AMD could actually keep up with Intel's CPUs. Not as good all the time, but it's very, very easy to make these AMD CPUs look bad: just run a single-threaded and/or older game on them and presto.
I would have liked to see how the video card plays into this: if AMD were running more optimized software, would its CrossFire come out better, along with its overall effect in games that make use of it?
I mean, how long have we had more than one CPU core now?
And I know it's taken the software people a while to come up to speed, but the game board is changing and they are starting more and more to use more than one core, so don't you think this would be important to check out as well? And just maybe see some new data on how AMD can use video cards if it's running software it was really designed for?
We can play this Intel single-thread line till hell freezes over, and we all know there will not be any surprises as long as we do.
And we have also seen low-cost game setups start to favor AMD's older CPUs because there is more software that can run on more cores. So let's start to even out the playing field a bit here, ok?
  • 18 Hide
    abbadon_34 , April 10, 2013 11:50 PM
I'm curious why 12-month-old drivers were used? Were the drivers considered irrelevant so old tests could be recycled?
  • 8 Hide
    Crashman , April 11, 2013 12:02 AM
esrever: I am very confused by the article and the results.
    Just look at the chart and the last sentence of the last page. SLI outperforms CrossFire on AMD FX, CrossFire outperforms SLI on Core i7.
CaptainTom: I'm going to be honest, this article didn't prove anything for these reasons: -The i7 is stronger so of course it scaled better. -The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power. -The differences overall were very little anyways besides the obvious things like Skyrim preferring Intel.
    Think about what you just said and then explain how that translates into the GeForce outperforming Radeon on an FX-8350, and Radeon outperforming GeForce on i7. It doesn't translate. The only thing that does translate is that slower processors favor Nvidia's architecture, which was the conclusion of the article.
smeezekitty: You hit the nail right on the head.
    Except for the part where CaptainTom didn't. If his perception were reality, SLI would have still lost on both CPUs, and would have lost by a greater amount on the Intel CPU. The fact that SLI won on the AMD CPU cancels out his reasoning.
abbadon_34: I'm curious why 12-month-old drivers were used? Were the drivers considered irrelevant so old tests could be recycled?
    Doubling up? OK, the first test was done in January prior to Cat 13.1. Cat 12.11 was still in beta and was still causing problems in some of the tests. So 2-month-old drivers needed to be used then. After that, the cards were given away in a contest (apologies to everyone who didn't win). And that created the need to recycle the test data. After that, fairness dictated that old GeForce drivers were used.
  • 3 Hide
    sarinaide , April 11, 2013 12:11 AM
It is true my A10-5800K runs slightly faster with a 680 as opposed to a 7970. Needless to say, I just threw in some 6850s and made life a lot simpler and more enjoyable; I had much better CFX scaling out of the 6850s than I did 7770s and 7850s. Ol' faithfuls.

  • 3 Hide
    avjguy2362 , April 11, 2013 12:28 AM
    I don't doubt the results here. However, a recent article in Anandtech talks about AMD being very aware that they have significant problems with their frame rate over time issues in crossfire and even with single cards. AMD is developing new tools similar to NVidia's, but of course to better see the issues with their own cards. They were using some Microsoft software, GPUView, that was highly detailed in giving a lot of data regarding their GPU's, but it is not specific enough for the problems they are having. Up until now they have made many FRAP improvements for single GPU's that have made a significant difference, but it may be several months before crossfire FRAP issues will be solved. So in fairness to AMD, it will be interesting to see if AMD makes the appropriate improvements and makes it public that they have addressed these problems in the near future. Near the end of the article: "AMD hasn’t fixed all of their issues yet and they waste no time admitting to it, so we will want to track their progress and see just how far along they are in bringing this issue under control."
  • 0 Hide
    silverblue , April 11, 2013 12:40 AM
BigMack70: I think AMD is just trolling gamers with their CPUs. Although they're definitely catching up to Intel while Intel just sorta sits there and doesn't do anything after declaring ultimate victory with Sandy Bridge.


    It sounds like Intel has long since reached the point of diminishing returns. AMD, on the other hand, have realised that they needed a slight departure from the road they were travelling on with Bulldozer and that, at least for the next two iterations of the architecture, they might make some decent gains - multithreading stands to be boosted decently with Steamroller, whilst Excavator will add further IPC gains as well as a big power drop. At least, that's the idea.
  • 1 Hide
    smeezekitty , April 11, 2013 12:41 AM
    Not to mention all the frame rates involved are so high it is really irrelevant.
  • 2 Hide
    Medjay , April 11, 2013 1:21 AM
How about 3570K vs 8350?
Same price, huh?
  • 4 Hide
    Crashman , April 11, 2013 1:52 AM
Medjay: How about 3570K vs 8350? Same price, huh?


There was no value score for these platforms; it was Intel's top mainstream CPU versus AMD's top mainstream CPU. Nobody involved in the article cared about the AMD vs. Intel argument, because the article was all about AMD's ability to support its own graphics cards with its own CPUs.
  • 0 Hide
    sarinaide , April 11, 2013 2:14 AM
I am not really surprised or concerned about this, as:

1) The differential is so small
2) AMD will never risk losing sales by specially optimizing for its own parts, which would have the effect of driving off buyers.

Either you use AMD CPUs or you don't; either you use AMD GPUs or you don't. As to value, AMD systems do well with mid-level SLI and CFX setups but not so well on the higher-end parts, but when you factor in the cost of building these systems, nobody really pairs top-end AMD GPUs with their AMD CPUs anyways.

I have 3 AMD systems: the FX-8350 is running 6970s, the A10 is running 6850s, and the 1100T is running 465s. My old Athlon II X4 is running 8800s, and my i7-980X is running 580s. I have found that all those systems have balance, so I will not be tweaking around much. What one will notice is that I don't use current-gen GPUs, as I find the quality of CFX and SLI is a bit down.
  • -2 Hide
    lolwhat , April 11, 2013 2:29 AM
why don't you try something new... like getting 10 gamers and putting them in a blind test for 2 hours: 1 hour playing with a 680 and 1 hour playing with a 7970... then ask them about it. Seriously, those numbers and out-of-scale graphs mean nothing to me, even more so because you always try to say go Nvidia, then one month later you say "best graphics card to buy isss 7970 trolololo"
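Several comments above touch on the difference between average frame rates and frame-rate-over-time (frame-pacing) consistency. A minimal sketch of why that distinction matters, using made-up frame times rather than anything measured in this article:

```python
# Hypothetical per-frame render times in milliseconds -- illustrative
# numbers only, not measurements from the article.
frame_times_ms = [16.7, 16.7, 16.7, 40.0, 16.7, 16.7, 33.0, 16.7]

# Average FPS over the run: total frames divided by total seconds.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Nearest-rank high-percentile frame time: the occasional 33-40 ms spikes
# (perceived as stutter) barely move the average but dominate this metric.
p99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

print(f"average FPS: {avg_fps:.1f}, 99th-percentile frame time: {p99_ms} ms")
```

Two runs with the same average FPS can feel very different to play if one has a much worse high-percentile frame time, which is the multi-GPU frame-pacing issue AMD acknowledged at the time.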