Benchmarking GeForce GTX Titan 6 GB: Fast, Quiet, Consistent

We've already covered the features of Nvidia's GeForce GTX Titan, the $1,000 GK110-powered beast set to exist alongside GeForce GTX 690. Now it's time to benchmark the board in one-, two-, and three-way SLI. Is it better than four GK104s working together?

Two days ago, we gave you our first look at a beastly single-GPU graphics board in Nvidia GeForce GTX Titan 6 GB: GK110 On A Gaming Card. For some reason, the company wanted to split discussion of Titan’s specs and its performance across two days. That’s not the direction I would have gone (had I been asked for my opinion, that is). But after knocking out several thousand words on the first piece and benchmarking for a week straight in the background, cutting our coverage in half didn’t bother me as much.

If you missed the first piece, though, pop open a new tab and check it out; that story lays the foundation for the numbers you’re going to see today.

I’m not going to waste any time rehashing the background on GeForce GTX Titan. In short, we ended up with a trio of the GK110-based boards. One was peeled off to compare against GeForce GTX 690, 680, and Radeon HD 7970 GHz Edition. Then, we doubled- and tripled-up the Titans to see how they’d fare against GeForce GTX 690s in four-way SLI. Of course, compute was important to revisit, so I spent some time digging into that before measuring power consumption, noise, and temperatures.

Let’s jump right into the system we used for benchmarking, the tests we ran, and the way we’re reporting our results, since it differs from what you’ve seen us do in the past.

Test Hardware
Processor
Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz overclocked to 4.5 GHz (45 * 100 MHz), LGA 2011, 15 MB Shared L3, Hyper-Threading enabled, Power-savings enabled
Motherboard
Intel DX79SR (LGA 2011), X79 Express Chipset, BIOS 0553
Memory
G.Skill 32 GB (8 x 4 GB) DDR3-1600, F3-12800CL9Q2-32GBZL @ 9-9-9-24 and 1.5 V
Hard Drive
Crucial m4 SSD 256 GB, SATA 6Gb/s
Graphics Cards
Nvidia GeForce GTX Titan 6 GB

Nvidia GeForce GTX 690 4 GB

Nvidia GeForce GTX 680 2 GB

AMD Radeon HD 7970 GHz Edition 3 GB
Power Supply
Cooler Master UCP-1000 W
System Software And Drivers
Operating System
Windows 8 Professional 64-bit
DirectX
DirectX 11
Graphics Drivers
Nvidia GeForce Release 314.09 (Beta) For GTX Titan

Nvidia GeForce Release 314.07 For GTX 680 And 690

AMD Catalyst 13.2 (Beta 5) For Radeon HD 7970 GHz Edition

We ran into a snag right away when the Gigabyte X79S-UP5-WiFi motherboard we use as our test bench proved incompatible with Titan. Neither Nvidia nor Gigabyte was able to explain why the card wouldn’t output a video signal, though the system seemed to boot otherwise.

Switching out for Intel’s DX79SR solved the issue. So, we overclocked a Core i7-3970X to 4.5 GHz, dropped in 32 GB of DDR3-1600 from G.Skill, and installed all of our apps on a 256 GB Crucial m4 to sidestep bottlenecks wherever possible.

One thing to keep in mind about GPU Boost 2.0: because the technology is now temperature-based, it's even more sensitive to environmental influence. We monitored the feature's behavior across a number of games and, left untouched, core clock rates tended to stick around 993 MHz. Raising the allowable thermal ceiling let them approach 1.1 GHz. As you might imagine, the difference between a benchmark run on a cold GPU peaking at 1.1 GHz and one run on a hot chip in a warm room can be significant. We made sure to maintain a constant 23 degrees Celsius in our lab, and only recorded benchmark results after a warm-up run.

Benchmarks And Settings
Battlefield 3
Ultra Quality Preset, V-Sync off, 1920x1080 / 2560x1600 / 5760x1200, DirectX 11, Going Hunting, 90-Second playback, Fraps
Far Cry 3
Ultra Quality Preset, DirectX 11, V-Sync off, 1920x1080 / 2560x1600 / 5760x1200, Custom Run-Through, 50-Second playback, Fraps
Borderlands 2
Highest-Quality Settings, PhysX Low, 16x Anisotropic Filtering, 1920x1080 / 2560x1600 / 5760x1200, Custom Run-Through, Fraps
Hitman: Absolution
Ultra Quality Preset, 2x MSAA, 1920x1080 / 2560x1600 / 5760x1200, Built-In Benchmark Sequence
The Elder Scrolls V: Skyrim
Ultra Quality Preset, FXAA Enabled, 1920x1080 / 2560x1600 / 5760x1200, Custom Run-Through, 25-Second playback, Fraps
3DMark
Fire Strike Benchmark
World of Warcraft: Mists of Pandaria
Ultra Quality Settings, 8x MSAA, Mists of Pandaria Flight Points, 1920x1200 / 2560x1600 / 5760x1200, Fraps, DirectX 11 Rendering, x64 Client
SiSoftware Sandra 2013 Professional
Sandra Tech Support (Engineer) 2013.SP1, GP Processing, Cryptography, Video Shader, and Video Bandwidth Modules
Corel WinZip 17
2.1 GB Folder, OpenCL Vs. CPU Compression
LuxMark 2.0
64-bit Binary, Version 2.0, Room Scene
Adobe Photoshop CS6
Scripted Filter Test, OpenCL Enabled, 16 MB TIF

Using one Dell 3007WFP and three Dell U2410 displays, we were able to benchmark our suite at 1920x1080, 2560x1600, and 5760x1200.

In the past, we would have presented a number of average frame rates for each game, at different resolutions, with and without anti-aliasing. Average FPS remains a good measurement, particularly for the ease with which it conveys relative performance. But we know plenty of valuable information is still missing.

Last week, Don introduced a three-part approach to breaking down graphics performance in Gaming Shoot-Out: 18 CPUs And APUs Under $200, Benchmarked. The first component involves average frame rate, so our old analysis is covered. Second, we have frame rate over time, plotted as a line graph. This shows you how high and low the frame rate goes during our benchmark run. It also illustrates how long a given configuration spends in comfortable (or unplayable) territory. The third chart measures the lag between consecutive frames. When this number is high, even if your average frame rate is solid, you’re more likely to “feel” jittery gameplay. So, we’re calculating the average time difference between consecutive frames, the 75th percentile (the lag that 75 percent of consecutive-frame intervals fall under), and the 95th percentile.
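To make the three metrics concrete, here's a minimal sketch of how they can be computed from a Fraps-style frame-time log. The frame-time values below are hypothetical sample data, and the nearest-rank percentile helper is our own illustration, not necessarily the exact method used in Don's article.

```python
import math

# Hypothetical Fraps-style frame times, in milliseconds (sample data).
frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 16.6, 17.0, 16.5, 40.2, 16.7]

# 1) Average frame rate: total frames divided by total elapsed time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 2) Frame rate over time: instantaneous FPS for each frame, ready to
#    plot as a line graph.
fps_over_time = [1000.0 / t for t in frame_times_ms]

# 3) Lag between consecutive frames: how much each frame time differs
#    from the frame before it.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

def percentile(values, pct):
    """Nearest-rank percentile: the smallest sample that at least pct
    percent of all samples fall at or below."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

avg_delta = sum(deltas) / len(deltas)  # average time difference
p75 = percentile(deltas, 75)           # lag 75% of intervals fall under
p95 = percentile(deltas, 95)           # the worst 5% of spikes start here
```

In this sample, a steady ~17 ms cadence punctuated by a few 33-40 ms spikes still yields a healthy average FPS, but the large 95th-percentile delta exposes exactly the jitter that the average alone would hide.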

All three charts complement each other, conveying the number of frames each card is able to output each second, and how consistently those frames show up. The good news is that we think this combination of data tells a compelling story. Unfortunately, instead of one chart for every resolution we test in a game, we now have three charts per resolution per game, which is a lot to wade through. As we move through today’s story, we’ll do our best to explain what all of the information means.

Comments
  • jimbaladin
    For $1000 that card sheath better be made out of platinum.
  • aofjax
    Lol, $1000.
  • Novuake
    Pure marketing. At that price Nvidia is just pulling a huge stunt... Still an insane card.
  • whyso
    if you use an actual 7970 GE card that is sold on newegg, etc instead of the reference 7970 GE card that AMD gave (that you can't find anywhere) thermals and acoustics are different.
  • cknobman
    Seems like Titan is a flop (at least at $1000 price point).

    This card would only be compelling if offered in the ~$700 range.

    As for compute? LOL looks like this card being a compute monster goes right out the window. Titan does not really even compete that well with a 7970 costing less than half.
  • downhill911
    If titan costs no more than 800USD, then really nice card to have since it does not, i call it a fail card, or hype card. Even my GTX 690 make more since and now you can have them for a really good price on ebay.
  • spookyman
    well I am glad I bought the 690GTX.

    Titan is nice but not impressive enough to go buy.
  • spentshells
    I feel 2 7970's should have been included in the multi card setups.
  • hero1
    jimbaladin: For $1000 that card sheath better be made out of platinum.

    Tell me about it! I think Nvidia shot itself on the foot with the pricing schim. I want AMD to come out with better drivers than current ones to put the 7970 at least 20% ahead of 680 and take all the sales from the greedy green. Sure it performs way better but that price is insane. I think 700-800 is the sweet spot but again it is rare, powerful beast and very consistent which is hard to find atm.
  • raxman
    "We did bring these issues up with Nvidia, and were told that they all stem from its driver. Fortunately, that means we should see fixes soon." I suspect their fix will be "Use CUDA".

    Nvidia has really dropped the ball on OpenCL. They don't support OpenCL 1.2, they make it difficult to find all their OpenCL examples. Their link for OpenCL is not easy to find. However their OpenCL 1.1 driver is quite good for Fermi and for the 680 and 690 despite what people say. But if the Titan has troubles it looks like they will be giving up on the driver now as well or purposely crippling it (I can't imagine they did not think to test some OpenCL benchmarks which every review site uses). Nvidia does not care about OpenCL Nvidia users like myself anymore. I wish there more people influential like Linus Torvalds that told Nvidia where to go.
  • realibrad
    Titan is made for a very small segment. The Microstutter issue for high end systems is very annoying, becuase you spent thousands, and at that point, it should work perfectly. A 690 will kill just about any game, but it does have microstutter issues. Why not get a Titan, who may have a slightly lower FPS, but a much better over all game.

    The Titan has a much smoother feel with the lows being better, and micro stutter almost completly gone. Now, if you go triple SLI with a 690, micro stutter is gone, but you are likely to do insane resolutions, and the lower amount of memory will bit you.

    The only reason to get a 690, is if you plan to only get 1, because you cant afford 2k in titans.
  • aofjax
    I wonder if I can build a whole new $1000 rig that can match/exceed a single Titan.....
  • Memnarchon
    Actually this card seems to be an engineer miracle (comparing GF110 vs GK110 its almost twice performance from one generation to an other). The frame latencies from such a gaming beast are also impressive.
    But its a single gpu ffs. Cut the memory to 3GB 384bit GDDR5 keep the SMX at 14/15 (2,688 cores) and priced it $750. (Its expensive again but it would sell a lot more)
    Then take a full GK110 15/15 (2,880) SMX with 6GB 384bit GDDR5 give it 100Mhz more on the core and name Titan Ultra at $1000+...
    Everyone is happy then.
  • outlw6669
    I was defiantly expecting more from Titan, especially for that $1000 price tag.
    Really, I would like to see it priced around $650ish before it is considered competitive.

    Also, way to go AMD; your GCN arch really is the king of compute!
    I was honestly expecting Titan to tromp the HD 7970 GHz and am pleasantly surprised.
    Future APU's with GCN onboard is looking better and better :)
  • Memnarchon
    outlw6669: I was defiantly expecting more from Titan, especially for that $1000 price tag. Really, I would like to see it priced around $650ish before it is considered competitive. Also, way to go AMD; your GCN arch really is the king of compute! I was honestly expecting Titan to tromp the HD 7970 GHz and am pleasantly surprised. Future APU's with GCN onboard is looking better and better

    Actually Chris Angelini has already answered this.
    Chris Angelini: We did bring these issues up with Nvidia, and were told that they all stem from its driver. Fortunately, that means we should see fixes soon.

    Also Anandtech made some tests also and revealed that Titan is better at compute power: Anandtech
  • Hellbound
    The card is not worth $1000.. $800 should have been the price point.
  • ilysaml
    People who say that Titan should be sold for $800, my question is why the hell should it be even sold @ the $800 range? It's not even twice faster than HD 7970 or GTX 680, in most cases it's 35%...that card should be $100-200$ more than a GTX 680/HD 7970 GHz. For me this card is just good for it's appearance.
  • blubbey
    People are missing the entire point of this. This is not a card for you or I. This is not a value card. This is at $1k for pure profit. This is an e-peen card, pure and simple.
  • mayankleoboy1
    Would have liked some video conversion bemchmarks too.
    And i hope you are planning a new article with the OpenCL drivers updated ? :)
  • phenom90
    i would never buy a titan that costs $1k... i would rather choose hd 7970 instead if i'm going to purchase a new card.. a $400 card vs $1k card... only idiot will buy a card that $600 more expensive to gaming on single 1080p.. and since amd began to put more efforts on their driver support.. and hopefully their driver will turn out to be as polished as nvidia.. i may try radeon in my next purchase...
  • Wisecracker
    Who will be the first, brave enough to put **Titan** in their sigs?
  • dscudella
    122690 said:
    That's actually my problem... it's so overpriced that it's not even good as an e-peen card. The 690 is still the better e-peen card.

    Even after what, 1 year, your XFire 7970's are still the best looking setup.
  • ilysaml
    Wisecracker: Who will be the first, brave enough to put **Titan** in their sigs?

  • Hazle
    Titan? don't you mean a(n overpriced) GTX 680Ti?

    aofjaxI wonder if I can build a whole new $1000 rig that can match/exceed a single Titan.....

    probably not (for now), but you'd get a better value out of it.