Challenging FPS: Testing SLI And CrossFire Using Video Capture

Results: Tomb Raider

In perhaps the most dramatic finish of our nine-game suite, the Radeon cards encounter their largest drop in experiential performance, sacrificing 16.5 FPS on average.

Comparatively, the GeForce GTX 660 Tis deliver the same result for both actual and practical frame rates.

Frame rate over time shows us just how many dropped and runt frames must be discarded for us to reach our practical frame rate.
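The filtering step described above can be sketched in a few lines of Python. This is purely an illustration of the concept, not the actual FCAT scripts: the function names and sample data are made up, and the 21-scan-line runt cutoff is an assumed threshold.

```python
# Illustrative sketch of FCAT-style runt filtering -- not the actual FCAT
# scripts. Each captured frame is represented by how many scan lines of the
# displayed image it occupied; 0 means the frame was dropped entirely.

RUNT_THRESHOLD = 21  # scan lines; an assumed cutoff for "runt" frames

def hardware_fps(scanlines, seconds):
    """Raw frame rate: every frame the cards flipped, runts included."""
    return sum(1 for n in scanlines if n > 0) / seconds

def practical_fps(scanlines, seconds):
    """Frame rate after discarding dropped (0-line) and runt frames."""
    return sum(1 for n in scanlines if n >= RUNT_THRESHOLD) / seconds

# Made-up one-second capture: eight flips, one dropped frame, two runts.
sample = [540, 8, 532, 0, 515, 12, 498, 545]
```

On this made-up sample the hardware rate is 7 FPS but only 5 FPS survive the filter, which is exactly the gap the frame-rate-over-time plots visualize.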

Frame time variance is low on the GeForce cards, whereas the Radeon boards exhibit notably more variance. We also noticed a bit of that stutter during gameplay.
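Frame time variance can be quantified several ways; as a sketch (the choice of metric here is an assumption, not the article's exact method), one simple measure is the standard deviation of consecutive frame-to-frame intervals:

```python
import statistics

def frame_times(timestamps_ms):
    """Per-frame intervals (ms) from a list of presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_variance(timestamps_ms):
    """Population standard deviation of frame times; 0 means perfectly even pacing."""
    return statistics.pstdev(frame_times(timestamps_ms))

even   = [0.0, 16.7, 33.4, 50.1, 66.8]  # steady ~60 FPS pacing
uneven = [0.0, 5.0, 33.4, 38.4, 66.8]   # same average rate, stuttery pacing
```

Two traces with the same average frame rate can thus show very different variance, which is why average FPS alone hides the stutter the plots reveal.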

  • Other Comments
  • kajunchicken
    Hopefully someone besides Nvidia develops this technology. If no one does, Nvidia can charge whatever they want...
    -12
  • cleeve
    kajunchicken: Hopefully someone besides Nvidia develops this technology. If no one does, Nvidia can charge whatever they want...

    FCAT isn't for end users, it's for review sites. The tech is supplied by hardware manufacturers, Nvidia just makes the scripts. They gave them to us for testing.
    20
  • cangelini
    kajunchicken: Hopefully someone besides Nvidia develops this technology. If no one does, Nvidia can charge whatever they want...

    And actually, it'd be nice to see someone like Beepa incorporate the overlay functionality, taking Nvidia out of the equation.
    15
  • cravin
    I wish there were an easy way to keep my frame rates from dipping and spiking so much. A lot of times it can go up to like 120 FPS but then dips down to 60-70. It makes it look super choppy and ugly. I know I could cap it at 60 frames a second, but wouldn't that just be like vsync? Would that have input lag?
    -10
  • DarkMantle
    Good review, but honestly I wouldn't use a tool touched by Nvidia to test AMD hardware; Nvidia has a track record of crippling the competition's hardware every chance it gets. Also, I was checking prices on Newegg, and to be honest the HD 7870 is much cheaper than the GTX 660 Ti. Why didn't you use the 7870 LE (Tahiti core) for this test? The price is much closer.
    The problem I have with the hardware you picked for this review is that, even though raw FPS isn't the main idea behind it, you are giving a tool to every troll on the net to say AMD hardware or drivers are crap. The idea behind the review is good, though.
    20
  • krneki_05
    Vsync would only cut out the frames above 60 FPS, so you would still have the FPS drops. But 60 or 70 FPS is more than you need (unless you are using a 120 Hz monitor and have super awesome eyes that can see the difference between 60 and 120 FPS, which some do). No, the choppy feeling you have must be something else, not the frames.
    10
  • BS Marketing
    Does it really matter? Over 60 FPS there will be screen tearing. So why the sudden fuss? I guess the Nvidia marketing engine is in full flow. The only explanation is that Nvidia is really scared now, trying everything in their power to deceive people.
    -15
  • rojodogg
    I enjoyed the article, and it was very informative. I look forward to more testing of other GPUs in the future.
    1
  • bystander
    Great article.

    But as great as the review is, I feel one thing that review sites have dropped the ball on is the lack of v-sync comparisons. A lot of people play with v-sync, and while a 60hz monitor is going to limit what you can test, you could get a 120hz or 144hz monitor and see how they behave with v-sync on.

    And the toughest question of all is how microstutter can be more accurately quantified. Not counting the runt frames gives a more accurate representation of FPS, but it does not quantify the microstutter that may be happening as a result.

    It seems the more info we get, the more questions I have.
    14
  • rene13cross
    @DarkMantle, exactly my thinking. I don't want to sound like a paranoid goof who thinks everything is a conspiracy but a test suite created by Nvidia to test AMD hardware doesn't sound like a very trustworthy test. I'm not saying that the results here are all false but Nvidia has had a slight history in the past with attempting to present the competition in an unfair light.
    1
  • bystander
    BigMack70: It would also be interesting to see how much using a framerate limiter through something like Afterburner helps things... my experience is that framerate limiter + vsync = no (or almost no) perceivable stutter, even where it may have been really awful beforehand. That's a good deal of extra work and data to present, though...


    Yeah, that is a big part of why I'd like to see v-sync used in a review some time. It also removes tearing, and is the primary way I play; v-sync on a 120hz monitor.
    4
  • Onus
    This will take considerable time to digest, but my quick take on it so far is that I am very glad I have never wasted time or money on a [midrange] Crossfire setup. SLI looks a lot more viable, but nVidia is no less guilty than AMD of releasing the occasional bum drivers.
    Particularly after re-reading pp. 1-2, please clarify: runts [and drops] are not an issue in single-card setups?
    0
  • bystander
    rene13cross: @DarkMantle, exactly my thinking. I don't want to sound like a paranoid goof who thinks everything is a conspiracy but a test suite created by Nvidia to test AMD hardware doesn't sound like a very trustworthy test. I'm not saying that the results here are all false but Nvidia has had a slight history in the past with attempting to present the competition in an unfair light.

    The test is one that AMD wanted as well. Well, at least that is what they are saying now, because it tests the output, not the start of the rendering process. I'm not sure how this type of test could skew results, as it just takes the frames like a monitor does, and shows us what the monitor shows.

    The part you could possibly quibble over is what quantifies a runt frame.
    6
  • ojas
    Don/Chris:

    There was an interesting AMD-backed story the other day on AT (not that it matters too much; both AMD and Nvidia seem not to like FRAPS very much). AMD apparently uses tools like GPUView from MS.
    http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps

    Intel's also got something called Graphics Performance Analyzers, which seems to be similar to GPUView.
    http://software.intel.com/en-us/vcsource/tools/intel-gpa
    2
  • ingtar33
    Love the article, though it mostly measures the unmeasurable, in ways where the eye can't tell the difference. I appreciate the reviewer's honesty when he admitted there were only two titles where he could tell the poorer-performing cards were performing worse... this tells me the settings for the test, or what the test is measuring, are largely undetectable and of questionable value.

    What needs to be done with FPS/FRAPS/whatever is to create a practical, tested, and verifiable standard which accurately portrays the playable experience. Sorta a meta rating which incorporates all these sub-criteria into a number... which will let us know how silky smooth the play experience will be with a gaming title.

    Of course, there is the added issue of an Nvidia program being used to measure an AMD part... with the way Intel used to (or might still, according to some people) influence certain benching programs, it's beyond problematic, especially given the way Nvidia has played in the past with certain competition, to use a software program made by one of the competitors. If their methodology has value, it should be re-engineered to ensure impartiality, and to prevent the obvious and expected fanboy mistrust.

    That said, I agree with the author's general point... this is an exciting time to be an enthusiast.
    2
  • nukemaster
    I too would like to see vsync comparisons.
    4
  • bystander
    krneki_05: Vsync would only cut out the frames above 60 FPS, so you would still have the FPS drops. But 60 or 70 FPS is more than you need (unless you are using a 120 Hz monitor and have super awesome eyes that can see the difference between 60 and 120 FPS, which some do). No, the choppy feeling you have must be something else, not the frames.

    I read an article on Techreport recently that explained another big component to choppy game play is time syncing. One that no review site has ever tried to tackle. It is one thing to have evenly spaced frames, but what if those frames are not synced to the action? That would have more unsettling results.
    1
  • ojas
    Um, I'm still reading; re: Batman: AC, two things:
    1. No FRAPS result for Nvidia? How do we know FRAPS isn't causing an issue there?
    2. The minimum FPS for the FRAPS measurement is actually lower than the hardware and practical ones. What's going on there? If FRAPS counts Present() calls, then shouldn't it be at least as high as the hardware FPS (unless I'm missing something; I think it should be the same at least)?
    5
  • ubercake
    BS Marketing: Does it really matter? Over 60 FPS there will be screen tearing. So why the sudden fuss? I guess the Nvidia marketing engine is in full flow. The only explanation is that Nvidia is really scared now, trying everything in their power to deceive people.

    No tearing on 120Hz monitors until you get over 120fps and even then tearing is no longer perceivable until you hit the mid 400s.

    Also, that is not the point of the article.

    This is a great article. It's consistent with others I've read on the subject, and consistent with what's being published regarding information AMD is also supporting.

    I look forward to seeing what you do with the tweaks to the FCAT software to further define what equates to a "runt" frame. It seems like that could make an even greater difference. Defining a runt frame seems somewhat subjective; a threshold higher than 21 scan lines could also define a runt, and the right cutoff would seem to depend somewhat on resolution?
    0
  • blazorthon
    BS Marketing: Does it really matter? Over 60 FPS there will be screen tearing. So why the sudden fuss? I guess the Nvidia marketing engine is in full flow. The only explanation is that Nvidia is really scared now, trying everything in their power to deceive people.


    You can get screen tearing regardless of your FPS. You might get it even at 40 FPS, and you might not get it at 200 FPS. Being over 60 FPS doesn't necessarily mean you'll have screen tearing, just as being under 60 FPS doesn't mean you won't.
    0