AMD Radeon HD 7990: Eight Games And A Beastly Card For $1,000

Test Setup, An Explanation Of FCAT, And Benchmarks

Test Hardware
Processor: Intel Core i7-3770K (Ivy Bridge), 3.5 GHz at 4.0 GHz (40 * 100 MHz), LGA 1155, 8 MB shared L3, Hyper-Threading enabled, power-savings enabled
Motherboard: Gigabyte Z77X-UD5H (LGA 1155), Z77 Express chipset, BIOS F15q
Memory: G.Skill 16 GB (4 x 4 GB) DDR3-1600, F3-12800CL9Q2-32GBZL @ 9-9-9-24 and 1.5 V
Hard Drive: Crucial m4 SSD 256 GB, SATA 6Gb/s
Graphics: AMD Radeon HD 7990 6 GB; AMD Radeon HD 7970 GHz Edition 3 GB; Nvidia GeForce GTX 690 4 GB; Nvidia GeForce GTX 680 2 GB; Nvidia GeForce GTX Titan 6 GB
Power Supply: Cooler Master UCP-1000 W

System Software And Drivers
Operating System: Windows 8 Professional 64-bit
DirectX: DirectX 11
Graphics Drivers: AMD Catalyst 13.5 (Beta 2); Nvidia GeForce Release 320.00; AMD Catalyst Frame Pacing Prototype v2 for Radeon HD 7990

Our work with Nvidia’s Frame Capture Analysis Tool (FCAT) last month yielded interesting information, and it continues to shape the way we plan to test multi-GPU configurations moving forward. Because it’s such a departure from the Fraps-based benchmarking we’ve done in the past, though, today’s review includes more than just FCAT-generated data. We’re also bringing a handful of gamers into our SoCal lab to go hands-on with the Radeon HD 7990 and GeForce GTX 690 in eight different titles. We’re hoping to pair unprecedentedly comprehensive FCAT performance data with a real-world “reality check” from gaming enthusiasts. We want to know whether this new emphasis on latency between successive frames maps to the actual gaming experience.

At the same time, we recognize that the new data we’re generating is far more sophisticated than the simple average frame rates that previously made it easy to pit two graphics cards against each other. Fortunately, we still have average results to report, along with frame rates over time. The newest addition is frame time variance. We’ve heard from readers that this metric isn’t as self-explanatory as we’d hoped, so here’s a walk-through to help clarify it.

Why aren’t we simply presenting frame times, as other sites are? Because we feel that raw frame time data includes too many variables for us to draw the right conclusions.

For example, a 40-millisecond frame sounds pretty severe. Is it indicative of stuttery playback? It might be, and it might not be. Consider the following two scenarios:

First, how would your game look if that 40-ms frame was surrounded on both sides by other frames that took the same amount of time to render? The resulting frame rate would be a very consistent 25 FPS, and you might not notice any stuttering at all. We wouldn’t call that frame rate ideal, but the even pacing would certainly help experientially.

Then consider the same 40-ms frame in a sea of 16.7-ms frames. In this case, the longer frame time would take more than twice as long as the frames before and after it, likely standing out as a stutter artifact of some sort.

Yes, the hypothetical is simplified for our purposes. But the point remains: if you want to call out stuttering in a game, you need more context than raw frame times. You also need to consider the frames surrounding the seemingly long ones. So, we came up with something called frame time variance.

We’re basically looking at each frame and determining whether it’s out of sync with the field of frames before and after it. In the first example, our 40-ms frame surrounded by other 40-ms frames would register a frame time variance of zero. In the second, the 40-ms frame surrounded by 16.7-ms frames would be reported as a variance of 23.3 ms.

Experimentation with this in the lab continues. But from what we’ve seen, gamers notice changes as small as 15 ms, so that is our baseline. If frame time variance stays under 15 ms, a single frame probably won’t cause a perceptible artifact. If the average variance approaches 15 ms, with spikes in excess of that, it’s reasonable to expect a gamer to report stuttering issues.

The actual Excel formula we’re using on frame times listed chronologically from top to bottom is as follows:

=ABS(B20-TRIMMEAN(B2:B38, 0.3))

This yields the frame time variance for the 20th frame in a capture, whose frame time is listed in cell B20.

Breaking this down, the formula looks at the frame time values from 18 cells before the targeted frame through 18 cells after it (37 values in total, including the target), and averages them with a trimmed mean that discards the highest and lowest 15% of the values (30% in total) so that the average isn’t affected by anomalous results. This trimmed average is then subtracted from the current frame time, and the result is reported as an absolute, or positive, value.
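For readers who want to run the same calculation on their own frame time captures, here’s a rough Python sketch of the metric. It’s our own reimplementation of Excel’s TRIMMEAN behavior (trim count rounded down to an even number, split across both ends), not the actual spreadsheet we use, so treat it as illustrative:

```python
def trimmean(values, proportion):
    """Approximate Excel's TRIMMEAN: drop `proportion` of the data points
    in total (rounded down to an even count), half from each end."""
    n = len(values)
    k = int(n * proportion) // 2          # points trimmed from each end
    trimmed = sorted(values)[k:n - k]
    return sum(trimmed) / len(trimmed)

def frame_time_variance(frame_times, window=18):
    """For each frame, return |frame time - trimmed mean of the surrounding
    window| (18 frames before and after, mirroring B2:B38 for cell B20).
    The window is clipped at the start and end of the capture."""
    variances = []
    for i, ft in enumerate(frame_times):
        lo = max(0, i - window)
        hi = min(len(frame_times), i + window + 1)
        variances.append(abs(ft - trimmean(frame_times[lo:hi], 0.3)))
    return variances
```

Plugging in the two scenarios above: a capture of steady 40-ms frames yields a variance of zero everywhere, while a lone 40-ms frame in a run of 16.7-ms frames registers roughly 23.3 ms, since the trimmed mean throws the spike out of its own window average.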

We’re always hoping to see a frame time variance of zero. In reality, though, there is always some variation one way or the other. So, we look across the whole capture and report average, 75th-percentile, and 95th-percentile values.
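Summarizing a capture that way might look like the following sketch. The helper names are hypothetical, and any standard percentile convention would serve; this one interpolates linearly between closest ranks:

```python
def percentile(values, p):
    """Percentile with linear interpolation between closest ranks
    (one common convention; others differ slightly at the edges)."""
    s = sorted(values)
    rank = (p / 100) * (len(s) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(s):
        return s[lo] + frac * (s[lo + 1] - s[lo])
    return s[lo]

def summarize_variances(variances):
    """Boil a capture's per-frame variances down to the three
    figures reported in the charts: average, 75th, and 95th percentile."""
    return {
        "average": sum(variances) / len(variances),
        "p75": percentile(variances, 75),
        "p95": percentile(variances, 95),
    }
```

The 95th-percentile figure is the one to watch for stutter: it tells you how bad the worst 5% of frames get, which an average alone can hide.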

I know—sounds like it gets pretty intense. But you’re going to see some pretty cool details from the nearly 1.5 TB of video we captured from AMD’s Radeon HD 7990, two Radeon HD 7970s in CrossFire, the Nvidia GeForce GTX 690, GeForce GTX Titan, and two GeForce GTX 680s in SLI. All of the testing was done at 2560x1440, and we’re using eight different games to represent each solution’s performance.

Benchmarks And Settings
Battlefield 3: Ultra Quality preset, v-sync off, 2560x1440, DirectX 11, "Going Hunting," 90-second playback, FCAT
Far Cry 3: Ultra Quality preset, DirectX 11, v-sync off, 2560x1440, custom run-through, 50-second playback, FCAT
Borderlands 2: Highest-quality settings, PhysX low, 16x anisotropic filtering, 2560x1440, custom run-through, FCAT
Hitman: Absolution: Ultra Quality preset, MSAA off, 2560x1440, built-in benchmark sequence, FCAT
The Elder Scrolls V: Skyrim: Ultra Quality preset, FXAA enabled, 2560x1440, custom run-through, 25-second playback, FCAT
3DMark: Fire Strike benchmark
BioShock Infinite: Ultra Quality settings, DirectX 11, Diffusion Depth of Field, 2560x1440, built-in benchmark sequence, FCAT
Crysis 3: Very High system spec, MSAA: Low (2x), high texture resolution, 2560x1440, custom run-through, 60-second sequence, FCAT
Tomb Raider: Ultimate Quality preset, FXAA enabled, 16x anisotropic filtering, TressFX Hair, 2560x1440, custom run-through, 45-second sequence, FCAT
LuxMark 2.0: 64-bit binary, version 2.0, Sala scene
SiSoftware Sandra 2013 Professional: Sandra Tech Support (Engineer) 2013.SP1, Cryptography and Financial Analysis performance
Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • blackmagnum
    If I had 1,000 dollars... I would buy a Titan. Its power efficiency, drivers and uber-chip goodness is unmatched.
    Reply
  • whyso
    Power usage?

    Thats some nice gains from the prototype driver.
    Reply
  • ilysaml
    Nice article!! Unbeatable performance out of the box.
    Reply
  • 17seconds
    Sort of seems like a mess to me. The game bundle is nice.
    Reply
  • timw03878
    Here's an idea. Take away the 8 games at 40 bucks a piece and deduct that from the insane 1000 price tag.
    Reply
  • donquad2001
    this test was 99% useless to the average gamer,Test the card at 1900x1080 like most of us use to get a real ideal of what its like,only your unigine benchmarks helped the average gamer,who cares what any card can do at a resolution we cant use anyway?
    Reply
  • cangelini
@whyso: "Power usage? Thats some nice gains from the prototype driver."
    Power is the one thing I didn't have time for. We already know the 7990 is a 375 W card, while GTX 690 is a 300 W card, though. We also know AMD has Zero Core, which is going to shave off power at idle with one GPU shut off. I'm not expecting any surprises on power that those specs and technologies don't already insinuate.
    Reply
  • ASHISH65
    nice article! here comes the Competitor of gtx 690!
    Reply
  • cangelini
@donquad2001: "this test was 99% useless to the average gamer ... who cares what any card can do at a resolution we cant use anyway?"
    If you're looking to game at 1920x1080, I can save you a ton of money by recommending something less than half as expensive. This card is for folks playing at 2560 *at least.* Next time, I'm looking to get FCAT running on a 7680x1440 array ;)
    Reply
  • hero1
    Nice article. I was hopping that they would have addressed the whining but they haven't and that's a shame. Performance wise it can be matched by GTX 680 SLI and GTX 690 without the huge time variance and runt frames. Let's hope they fix their whining issue and FPS without forcing users to turn on V-sync. For now I know where my money is going consider that I have dealt with AMD before:XFX and Sapphire and didn't like the results (whining, artifacts, XF stops working etc). Sorry but I gave the red team a try and I will stick with Nvidia until AMD can prove that they have fixed their issues.
    Reply