Test Setup, An Explanation Of FCAT, And Benchmarks

AMD Radeon HD 7990: Eight Games And A Beastly Card For $1,000
By , Igor Wallossek
Test Hardware
Processors
Intel Core i7-3770K (Ivy Bridge) 3.5 GHz at 4.0 GHz (40 * 100 MHz), LGA 1155, 8 MB Shared L3, Hyper-Threading enabled, Power-savings enabled
Motherboard
Gigabyte Z77X-UD5H (LGA 1155) Z77 Express Chipset, BIOS F15q
Memory
G.Skill 16 GB (4 x 4 GB) DDR3-1600, F3-12800CL9Q2-32GBZL @ 9-9-9-24 and 1.5 V
Hard Drive
Crucial m4 SSD 256 GB SATA 6Gb/s
Graphics
AMD Radeon HD 7990 6 GB

AMD Radeon HD 7970 GHz Edition 3 GB

Nvidia GeForce GTX 690 4 GB

Nvidia GeForce GTX 680 2 GB

Nvidia GeForce GTX Titan 6 GB
Power Supply
Cooler Master UCP-1000 W
System Software And Drivers
Operating System
Windows 8 Professional 64-bit
DirectX
DirectX 11
Graphics Driver
AMD Catalyst 13.5 (Beta 2)

Nvidia GeForce Release 320.00

AMD Catalyst Frame_Pacing_Prototype v2 For Radeon HD 7990


Our work with Nvidia’s Frame Capture Analysis Tools last month yielded interesting information, and it continues to shape the way we plan to test multi-GPU configurations moving forward. Because it’s such a departure from the Fraps-based benchmarking we’ve done in the past, though, today’s review includes more than just FCAT-generated data. We’re also bringing a handful of gamers to our SoCal lab to go hands-on with Radeon HD 7990 and GeForce GTX 690 in eight different titles. What we’re hoping to achieve is unprecedentedly comprehensive performance data using FCAT, and then the real-world “reality check” from gaming enthusiasts. We want to know if this new emphasis on latency between successive frames maps to the actual gaming experience.

At the same time, we recognize that the new data we’re generating is far more sophisticated than the simple average frame rates that previously made it easy to pit two graphics cards against each other. Fortunately, we still have average results to report, along with frame rates over time. The newest addition is frame time variance. We’ve heard that this metric isn’t as self-explanatory as we’d hoped, so we have the following explanation to help clarify.

Why aren’t we simply presenting frame times, as other sites are? Because we feel that raw frame time data includes too many variables for us to draw the right conclusions.

For example, a 40-millisecond frame sounds pretty severe. Is this indicative of stuttery playback? It might be, and it might not be. Consider the following two scenarios:

First, how would your game look if that 40-ms frame was surrounded on both sides by other frames that took the same amount of time to render? The resulting frame rate would be a very consistent 25 FPS, and you might not notice any stuttering at all. We wouldn’t call that frame rate ideal, but the even pacing would certainly help experientially.

Then consider the same 40-ms frame in a sea of 16.7-ms frames. In this case, the longer frame time would take more than twice as long as the frames before and after it, likely standing out as a stutter artifact of some sort.

Yes, the hypothetical is simplified for our purposes. But the point remains: if you want to call out stuttering in a game, you need more context than raw frame times. You also need to consider the frames around those seemingly longer ones. So, we came up with something called frame time variance.

We’re basically looking at each frame and determining whether it’s out of sync with the field of frames before and after it. In the first example, our 40-ms frame surrounded by other 40-ms frames would register a frame time variance of zero. In the second example, the 40-ms frame surrounded by 16.7-ms frames would be reported as a variance of 23.3 ms.

Experimentation with this in the lab continues. But from what we’ve seen, gamers are noticing changes as small as 15 ms. Therefore, this is our baseline. If frame time variance is under 15 ms, a single frame probably won’t cause a perceptible artifact. If the average variance approaches 15 ms, with spikes in excess, it’d be reasonable to expect a gamer to report stuttering issues.

The actual Excel formula we’re using on frame times listed chronologically from top to bottom is as follows:

=ABS(B20-(TRIMMEAN(B2:B38, 0.3)))     //The formula describes the frame time variance for the 20th frame in a capture, listed in cell B20.

Breaking this down, the formula looks at the frame time values from 18 cells before through 18 cells after the targeted frame, and averages them (excluding the most extreme 30% of values so that the average isn’t skewed by anomalous results). This average frame time is then subtracted from the current frame time, and the result is returned as an absolute, or positive, value.
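
For anyone who wants to reproduce this outside of Excel, the following Python sketch implements the same idea. The function names are just for illustration, the hand-rolled trimmed mean only approximates TRIMMEAN’s behavior (dropping an even number of extreme values, half from each end), and the handling of frames near the start and end of a capture, where a full 37-cell window isn’t available, is a simple clamp of our own rather than anything dictated by the formula above.

def trimmed_mean(values, percent=0.3):
    # Approximate Excel's TRIMMEAN: exclude `percent` of the data points,
    # rounded down to an even count, half from each end, then average the rest.
    data = sorted(values)
    half = int(len(data) * percent / 2)
    kept = data[half:len(data) - half]
    return sum(kept) / len(kept)

def frame_time_variance(frame_times, window=18, percent=0.3):
    # For each frame, take the trimmed mean of the frames up to `window`
    # positions before and after it (the frame itself included, matching
    # the B2:B38 range around B20), then report the absolute difference.
    variance = []
    for i, t in enumerate(frame_times):
        lo = max(0, i - window)
        hi = min(len(frame_times), i + window + 1)
        variance.append(abs(t - trimmed_mean(frame_times[lo:hi], percent)))
    return variance

# The two scenarios from above: a 40-ms frame among other 40-ms frames,
# and the same 40-ms frame in a sea of 16.7-ms frames.
steady = [40.0] * 37
spiky = [16.7] * 18 + [40.0] + [16.7] * 18
print(round(frame_time_variance(steady)[18], 1))  # 0.0 ms
print(round(frame_time_variance(spiky)[18], 1))   # 23.3 ms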

We’re always hoping to see frame time variance of zero. In reality, though, there is always some variation one way or the other. So, we look across the spectrum and report average, 75th, and 95th percentile values.
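
As a rough sketch of how one of those per-frame variance series could be boiled down to the three reported numbers (the percentile method here, Python’s statistics module with the "inclusive" rule, is our choice for illustration, and the sample values are hypothetical):

from statistics import mean, quantiles

# Hypothetical per-frame variance values in milliseconds, e.g. the output
# of the frame_time_variance() sketch above on a real capture.
variance = [0.4, 1.2, 0.9, 23.3, 2.1, 0.7, 15.8, 3.4, 0.5, 1.1]

cuts = quantiles(variance, n=100, method="inclusive")  # cuts[i] is the (i+1)th percentile
print(f"average: {mean(variance):.1f} ms, "
      f"75th percentile: {cuts[74]:.1f} ms, "
      f"95th percentile: {cuts[94]:.1f} ms")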

I know—sounds like it gets pretty intense. But you’re going to see some pretty cool details from the nearly 1.5 TB of video we captured from AMD’s Radeon HD 7990, two Radeon HD 7970s in CrossFire, the Nvidia GeForce GTX 690, GeForce GTX Titan, and two GeForce GTX 680s in SLI. All of the testing was done at 2560x1440, and we’re using eight different games to represent each solution’s performance.

Benchmarks And Settings
Battlefield 3
Ultra Quality Preset, v-sync off, 2560x1440, DirectX 11, Going Hunting, 90-Second playback, FCAT
Far Cry 3
Ultra Quality Preset, DirectX 11, v-sync off, 2560x1440, Custom Run-Through, 50-Second playback, FCAT
Borderlands 2
Highest-Quality Settings, PhysX Low, 16x Anisotropic Filtering, 2560x1440, Custom Run-Through, FCAT
Hitman: Absolution
Ultra Quality Preset, MSAA Off, 2560x1440, Built-In Benchmark Sequence, FCAT
The Elder Scrolls V: Skyrim
Ultra Quality Preset, FXAA Enabled, 2560x1440, Custom Run-Through, 25-Second playback, FCAT
3DMark
Fire Strike Benchmark
BioShock Infinite
Ultra Quality Settings, DirectX 11, Diffusion Depth of Field, 2560x1440, Built-in Benchmark Sequence, FCAT
Crysis 3
Very High System Spec, MSAA: Low (2x), High Texture Resolution, 2560x1440, Custom Run-Through, 60-Second Sequence, FCAT
Tomb Raider
Ultimate Quality Preset, FXAA Enabled, 16x Anisotropic Filtering, TressFX Hair, 2560x1440, Custom Run-Through, 45-Second Sequence, FCAT
LuxMark 2.0
64-bit Binary, Version 2.0, Sala Scene
SiSoftware Sandra 2013 Professional
Sandra Tech Support (Engineer) 2013.SP1, Cryptography, Financial Analysis Performance
Comments
  • 12
    whyso, April 23, 2013 9:36 PM
    Power usage?

    That's some nice gains from the prototype driver.
  • 12
    17seconds, April 23, 2013 9:37 PM
    Sort of seems like a mess to me. The game bundle is nice.
  • 23
    timw03878, April 23, 2013 9:47 PM
    Here's an idea: take away the 8 games at 40 bucks apiece and deduct that from the insane $1,000 price tag.
  • 10
    cangelini, April 23, 2013 9:51 PM
    whyso: "Power usage? That's some nice gains from the prototype driver."

    Power is the one thing I didn't have time for. We already know the 7990 is a 375 W card, while GTX 690 is a 300 W card, though. We also know AMD has Zero Core, which is going to shave off power at idle with one GPU shut off. I'm not expecting any surprises on power that those specs and technologies don't already insinuate.
  • -4
    ASHISH65, April 23, 2013 9:51 PM
    Nice article! Here comes the competitor of the GTX 690!
  • 26
    cangelini, April 23, 2013 9:53 PM
    donquad2001: "This test was 99% useless to the average gamer. Test the card at 1900x1080 like most of us use to get a real idea of what it's like. Only your Unigine benchmarks helped the average gamer. Who cares what any card can do at a resolution we can't use anyway?"

    If you're looking to game at 1920x1080, I can save you a ton of money by recommending something less than half as expensive. This card is for folks playing at 2560 *at least.* Next time, I'm looking to get FCAT running on a 7680x1440 array ;) 
  • 2
    hero1, April 23, 2013 9:54 PM
    Nice article. I was hoping that they would have addressed the whining, but they haven't, and that's a shame. Performance-wise it can be matched by GTX 680 SLI and the GTX 690 without the huge time variance and runt frames. Let's hope they fix their whining issue and FPS without forcing users to turn on v-sync. For now I know where my money is going, considering that I have dealt with AMD before (XFX and Sapphire) and didn't like the results (whining, artifacts, XF stops working, etc.). Sorry, but I gave the red team a try and I will stick with Nvidia until AMD can prove that they have fixed their issues.
  • 1
    ohim, April 23, 2013 10:02 PM
    Why do all you people, whom this card is not made for, complain about the price tag? AMD and Nvidia surely don't make much of a profit, if any, on these monsters. They are just for show, like in the CPU business.
    People mostly buy Intel (i3/i5 a lot more than i7) just because Intel can provide top-of-the-line CPUs in the i7 Extreme range. Same goes here: if someone hears that AMD has a better $1,000 card than Nvidia, they will probably spend $100-200 on an AMD card and not Nvidia.
    Power... unless you're a guy who saves two years in a row for this card to have six months of nerd gaming glory, you won't care that much how power hungry this card is.

    It's just like asking Ferrari or Lamborghini how many mpg their cars do.
  • 6
    mayankleoboy1, April 23, 2013 10:14 PM
    1. I wonder if inserting all those pauses in the rendering pipeline for smoothness harms compute performance.

    2. Regarding the fan noise and the hum: it would be interesting to know how noticeable the fan noise and the hum are with increasing listener distance. IOW, which noise is more noticeable at near/medium/far distances?

    Drivers are still AMD's biggest weakness. I would have expected AMD to have top-notch, A-one drivers to go with the HD 7990. After all, this is AMD's halo product. The first impression is what matters. The conclusion is basically "Card is good. Drivers are poor, with better coming in the future." So ultimately it's selling a promise, which may or may not succeed. It appears to me that AMD doesn't value its own products.
  • -1
    mayankleoboy1, April 23, 2013 10:18 PM
    Ohh, and a video conversion test would have been nice too. (Is there any software available that supports CFX?)
    Also, has the Video Conversion Engine in AMD taken off?
  • 0
    jezus53, April 23, 2013 10:26 PM
    It's very interesting that AMD couldn't find a capacitor that wouldn't cause this noise. I feel once third-party vendors get the reference design they'll find ways of removing that. Hope they fix it soon, or else Nvidia will have a new line of cards while AMD is having problems with nearly two-year-old chips!
  • 1
    dragonsqrrl, April 23, 2013 10:41 PM
    hero1: "Nice article. I was hoping that they would have addressed the whining, but they haven't, and that's a shame. Performance-wise it can be matched by GTX 680 SLI and the GTX 690 without the huge time variance and runt frames. Let's hope they fix their whining issue and FPS without forcing users to turn on v-sync. For now I know where my money is going, considering that I have dealt with AMD before (XFX and Sapphire) and didn't like the results (whining, artifacts, XF stops working, etc.). Sorry, but I gave the red team a try and I will stick with Nvidia until AMD can prove that they have fixed their issues."

    Unfortunately I'm really not sure the whining issue is something that can be fixed with a driver update. I think it has more to do with the hardware on the board than anything else. But it's good to see that AMD has finally recognized the frame time variance and micro-stutter problem, and is actively pursuing a solution. Although the test in the review was limited, I think it's telling that every gamer tested was able to recognize the difference between AMD and Nvidia cards, and even the difference brought by AMD's own prototype drivers.
  • 2
    hero1, April 23, 2013 10:51 PM
    dragonsqrrl: "Unfortunately I'm really not sure the whining issue is something that can be fixed with a driver update. I think it has more to do with the hardware on the board than anything else. But it's good to see that AMD has finally recognized the frame time variance and micro-stutter problem, and is actively pursuing a solution. Although the test in the review was limited, I think it's telling that every gamer tested was able to recognize the difference between AMD and Nvidia cards, and even the difference brought by AMD's own prototype drivers."

    I know, and that's what I meant by hoping that they would have addressed the whining with this card. It happens to all their cards, well, the ones that I have owned, especially the XFX, and if they knew what causes it they should have fixed it.

    Let's hope that the prototype driver will also translate to better drivers for all their GPUs and address the frame rate issues. Other than that, it is a good card, but for my personal use, since I was waiting to see what this card can offer, I will just get the GTX 680 or the GTX 780 next month, and I will definitely go back to AMD if they address those issues.
  • 2
    dragonsqrrl, April 23, 2013 10:54 PM
    whyso: "Power usage? That's some nice gains from the prototype driver."


    For everyone seeking power and heat results:
    http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official/16

    It consumes a lot of power under load, substantially more than the GTX 690, but like Chris said, that's to be expected. The big difference with the 7990 seems to be acoustics in relation to temps at load. It's a massive improvement over the 6990, and pretty much on par with the GTX 690. Unfortunately the coil whine seems to undo a lot of the improvements made to the stock cooler, but all things considered it's pretty impressive what AMD was able to do in this area, especially in comparison to unofficial solutions from other vendors (dual-slot, only requires two 8-pin connectors).
  • 2
    dragonsqrrl, April 23, 2013 10:57 PM
    hero1: "I know, and that's what I meant by hoping that they would have addressed the whining with this card."

    Sorry, that's a reading fail on my part. Thought you said, 'hope they'll address the whining' or something to that effect.
  • 2
    hero1, April 23, 2013 11:00 PM
    dragonsqrrl: "Sorry, that's a reading fail on my part. Thought you said 'hope they'll address the whining' or something to that effect."


    No problem.
  • 5
    bartholomew, April 23, 2013 11:32 PM
    A price tag of $849.99 would have been quite aggressive and would have increased its value significantly.