GeForce GTX Titan X Review: Can One GPU Handle 4K?

Confidence is sexy, I’m told.

Nvidia showed serious confidence in GeForce GTX Titan X when the company shipped over one card and one monitor—an Acer XB280HK bprz—the first G-Sync-equipped 4K display. Does Nvidia expect its new single-GPU flagship to drive a 3840x2160 resolution with performance to spare? How cool would that be? My little Tiki, tucked out of sight, would plow through the most taxing games at Ultra HD. Oh lawdy. And here I thought I’d be stuck at 2560x1440 forever.

The 980 didn’t get us to a point where one graphics processor could handle the demands of 4K at enthusiast-class detail levels. But the GeForce GTX Titan X is based on GM200, composed of eight billion transistors. Eight. Billion. That’s almost three billion more than the GeForce GTX 980’s GM204 and one billion more than GK110, the original GeForce GTX Titan’s cornerstone.

In their final form, those transistors manifest as a 601mm² piece of silicon, which is about nine percent larger than GK110. The GM200 GPU is manufactured on the same 28nm high-k metal gate process, so this is simply a larger, more complex chip. And yet it’s rated for the same 250W maximum board power. With higher clock rates. And twice as much memory. Superficially, the math seems off. But it isn’t.

Inside Of GM200

Like the GeForce GTX 980 and 970 we were introduced to last September, GM200 is based on Nvidia’s efficient Maxwell architecture. Instead of GM204’s four Graphics Processing Clusters, you get six. And with four Streaming Multiprocessors per GPC, that adds up to 24 SMMs across the GPU. Multiply out the 128 CUDA cores per SMM and you get GeForce GTX Titan X’s total of 3072. Eight texture units per SMM add up to 192, and at a 1000MHz base core clock rate that’s 192 GTexels/s (the original GeForce GTX Titan was rated at 188 despite its higher texture unit count, owing to its lower clock rate).
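
The arithmetic is easy to sanity-check. Here’s a minimal sketch (C++ purely for illustration; the constants are the figures quoted above, and one texel per texture unit per clock is assumed):

    #include <cstdio>

    int main() {
        // GM200's shader hierarchy, per the figures above
        const int gpcs        = 6;    // Graphics Processing Clusters
        const int smmsPerGpc  = 4;    // Streaming Multiprocessors per GPC
        const int coresPerSmm = 128;  // CUDA cores per SMM
        const int texPerSmm   = 8;    // texture units per SMM
        const double clockGHz = 1.0;  // 1000MHz base clock

        const int smms     = gpcs * smmsPerGpc;   // 24
        const int cores    = smms * coresPerSmm;  // 3072
        const int texUnits = smms * texPerSmm;    // 192

        printf("SMMs: %d, CUDA cores: %d\n", smms, cores);
        printf("Texture rate: %.0f GTexels/s\n", texUnits * clockGHz);
        return 0;
    }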

Like the SMMs found in GM204, each of GM200’s exposes 96KB of shared memory and 48KB of texture/L1 cache, up from what the GeForce GTX 750 Ti’s GM107 offered. The other architectural elements are similar, too: each SMM is broken into four blocks, each with its own instruction buffer, warp scheduler and pair of dispatch units. In fact, so much is carried over that double-precision math is still specified at 1/32 the rate of FP32, even though GM200 is the Maxwell family’s big daddy. Incidentally, an upcoming Quadro card based on the same GPU shares this fate. If FP64 performance is truly important to you, Nvidia would likely suggest one of its Tesla boards.
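
To put that 1/32 ratio in numbers, count a fused multiply-add as two floating-point operations (the usual convention) and the base clock works out to roughly 6.1 TFLOPS of single-precision and about 192 GFLOPS of double-precision:

    #include <cstdio>

    int main() {
        const double cudaCores   = 3072;
        const double clockGHz    = 1.0;  // 1000MHz base clock
        const double flopsPerFma = 2.0;  // one fused multiply-add = 2 FLOPS

        const double fp32TFlops = cudaCores * flopsPerFma * clockGHz / 1000.0;
        const double fp64GFlops = fp32TFlops * 1000.0 / 32.0;  // FP64 at 1/32 rate

        printf("FP32: ~%.2f TFLOPS\n", fp32TFlops);  // ~6.14
        printf("FP64: ~%.0f GFLOPS\n", fp64GFlops);  // ~192
        return 0;
    }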

GeForce GTX 980’s four ROP partitions grow to six in GeForce GTX Titan X. With 16 units each, that’s up to 96 32-bit integer pixels per clock. The ROP partitions are aligned with 512KB slices of L2 cache, totaling 3MB in GM200. When it introduced GeForce GTX 750 Ti, Nvidia talked about a big L2 as a mechanism for preventing bottlenecks on a relatively narrow 128-bit memory interface. That’s not as big a concern with GM200, given its 384-bit path populated by 7 Gb/s GDDR5. Maximum throughput of 336.5 GB/s matches the GeForce GTX 780 Ti, and exceeds GeForce GTX Titan, GeForce GTX 980 and Radeon R9 290X.
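
The back-end peaks fall out of the specifications the same way. This sketch assumes one pixel per ROP per clock, and uses the ~7.01 Gb/s effective per-pin rate implied by the quoted 336.5 GB/s:

    #include <cstdio>

    int main() {
        // Pixel fill: six ROP partitions of 16 units, one pixel per unit per clock
        const double partitions = 6, ropsPerPartition = 16, clockGHz = 1.0;
        printf("Fill rate: %.0f Gpixels/s\n",
               partitions * ropsPerPartition * clockGHz);  // 96

        // Bandwidth: bus width (bits) x per-pin data rate / 8 bits per byte
        const double busBits  = 384;
        const double gbPerPin = 7.01;  // ~7 Gb/s GDDR5 (7010 MT/s effective)
        printf("Bandwidth: %.1f GB/s\n", busBits * gbPerPin / 8.0);  // ~336.5
        return 0;
    }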

On-Board GeForce GTX Titan X

Nvidia drops GM200 on a 10.5”-long PCB that resembles the high-end boards we’ve been seeing for more than two years now. Model-specific differences are apparent when you look more closely, but we’re talking about the same dimensions here, which undoubtedly makes life easier for system builders who were nervous about integration.

The same number of single-die memory ICs surround the GPU. However, now we’re looking at 4Gb (512MB) packages of SK hynix’s fastest GDDR5, totaling 12GB. Even for 4K, that’s overkill today. Still, Nvidia says it’s going for future-proofing, and if there’s a future where Ultra HD displays in Surround are driven by three or four Titan X boards in SLI, 6GB wouldn’t have been enough.
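
Nvidia doesn’t publish the package count on the spec sheet, but it can be derived from the capacity and the bus. A small sketch (the two-per-channel arrangement is our inference, not a quoted figure):

    #include <cstdio>

    int main() {
        // Derived: 12GB built from 4Gb (512MB) packages implies 24 ICs,
        // i.e. two per 32-bit channel on the 384-bit bus.
        const int totalMB  = 12 * 1024;
        const int mbPerPkg = 512;                 // 4Gb = 512MB
        const int packages = totalMB / mbPerPkg;  // 24
        const int channels = 384 / 32;            // twelve 32-bit channels
        printf("%d packages, %d per channel\n", packages, packages / channels);
        return 0;
    }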

A plate sits on top of the PCB, cooling a number of the surface-mounted components. There’s a copper vapor chamber mounted to that, topped by a two-slot-tall aluminum heat sink. Nvidia’s reference design remains faithful to the centrifugal fan, which pulls in air from your chassis, pushes it over the plate, through the heat sink and out the back. Although blower-style fans tend to create more noise than axial coolers, we’ve seen enough cards based on this same industrial design to know they’re acoustically friendly. GeForce GTX Titan X is no exception.

An aluminum housing covers the internals. It’s more textured than we’ve seen in the past, and Nvidia paints the enclosure black. Do you remember this picture from The Story of How GeForce GTX 690 And Titan Came To Be? It’s sort of like that, except without green lighting under the fins.

Also, the backplate present on GeForce GTX 980 is missing. Although part of the plate was removable to augment airflow in arrays of multiple 980s, Titan X is a more power-hungry card. To give adjacent boards as much breathing room as possible, Nvidia got rid of it entirely. I don’t personally mind the omission, but Igor was fond of the more polished look.

GeForce GTX Titan X shares the 980’s display outputs, including one dual-link DVI port, one HDMI 2.0-capable connector and three full-sized DisplayPort interfaces. Between those five options, you can drive as many as four displays at a time. And if you’re using G-Sync-enabled displays, that trio of DisplayPort 1.2 outputs makes Surround a viable choice.

A Mind To The Future

Beyond the Titan X’s 12GB of GDDR5 memory, Nvidia calls out a number of the GPU’s features said to make it more future-proof.

During this year’s GDC, Microsoft mentioned that 50% of today’s graphics hardware is covered by DirectX 12, and by the holiday season, two-thirds of the market will be compatible. That means a lot of graphics cards are going to work with the API. But support comes at one of two feature levels, which group DirectX 12’s capabilities: 12.0 and 12.1. According to Microsoft’s Max McMullen, 12.0 exposes a lot of the API’s CPU-oriented performance advantages, while 12.1 adds Conservative Rasterization and Rasterizer Ordered Views (ROVs) for more powerful rendering algorithms.

As you might expect from a GPU said to be built with the future in mind, GM200 supports feature level 12.1 (as does GM204). Everything older, including the GM107 found in GeForce GTX 750 Ti, is limited to feature level 12.0. We also asked AMD whether its GCN-based processors support 12.0 or 12.1; a representative said he could not comment at this time.
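
For developers wondering how the two tiers get told apart in practice, here’s a minimal sketch against Direct3D 12’s capability-query API (shown as the API eventually shipped; adapter enumeration and error handling are trimmed). The 12.1-level additions surface as a conservative rasterization tier above zero and as ROV support:

    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        // Create a device on the default adapter at the 12.0 baseline.
        ID3D12Device* device = nullptr;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            printf("No feature level 12.0 device available\n");
            return 1;
        }

        // Query the optional capabilities that distinguish 12.1 hardware.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                    &opts, sizeof(opts));

        printf("Conservative rasterization tier: %d\n",
               (int)opts.ConservativeRasterizationTier);
        printf("Rasterizer ordered views: %s\n",
               opts.ROVsSupported ? "yes" : "no");

        device->Release();
        return 0;
    }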

The Maxwell architecture enables a number of other features, some of which are exploitable today, while others require developer support to expose. For more information about Dynamic Super Resolution, Multi-Frame Sampled Anti-Aliasing (a great way to diminish the performance impact of AA at 4K on Titan X, by the way), VR Direct and Voxel Global Illumination, check out this page from Don’s GeForce GTX 980 review.

  • Comments
  • Yuka
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!
    -20
  • chiefpiggy
    The R9 295X2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, Nvidia. And Nvidia fanboys can just let the salt flow through their veins that a previous-generation card(s) can beat their newest and most powerful card. Can't wait for the 3xx series to smash the Nvidia 9xx series
    4
  • chiefpiggy
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    6
  • rolli59
    Would be interesting to compare with cards like the 970 and R9 290 in dual-card setups, basically performance for money.
    5
  • esrever
    Performance is pretty much expected from the leaked specs. Not bad performance but terrible price, as with all titans.
    26
  • dstarr3
    I don't know. I have a GTX770 right now, and I really don't think there's any reason to upgrade until we have cards that can average 60fps at 4K. And... that's unfortunately not this.
    10
  • hannibal
    Well, this is actually cheaper than I expected. Interesting card, and it would really benefit from less heat... The throttling is really the limiting factor here.
    But yeah, this is expensive for its power, as Titans always have been, but it is not out of reach either. We need 14 to 16nm FinFET GPUs to make really good 4K graphics cards!
    Maybe next year...
    2
  • cst1992
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? It's effectively a Titan X vs. 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X. Although I heard NVIDIA's card will be the slower one there.
    Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).
    -1
  • skit75
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD


    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around.

    Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.
    6
  • negevasaf
    IGN said that the R9 390X (8.6 TF) is 38% more powerful than the Titan X (6.2 TF); is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
    6
  • chiefpiggy
    Anonymous said:
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? It's effectively a Titan X vs. 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X. Although I heard NVIDIA's card will be the slower one there.
    Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).


    What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs. a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.
    2
  • giovanni86
    I was hoping for far better results. Though priced at $1K it may seem worthwhile; I'll be waiting to see if EVGA releases something.
    -1
  • chiefpiggy
    Anonymous said:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD


    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around.

    Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Just because it's one GPU doesn't mean people should pay twice for less performance. If you can't see that then I honestly do not understand... And their supposed ace in the hole is the 12GB of VRAM and G-Sync performance, but for a "4K" card I for one am not impressed
    3
  • chiefpiggy
    Anonymous said:
    IGN said that the R9 390X (8.6 TF) is 38% more powerful than the Titan X (6.2 TF); is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked


    It's completely up to you whether or not to believe the rumors that come out over time, but I would usually just wait for the actual card to come out and then compare the benchmarks :p

    Unless of course we're talking about the GTX 970 scandal
    7
  • backoffmanImascientist
    Quote:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD


    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around.

    Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Single (and Crossfired) 295X2 vs 2 GTX Titans in SLI coming right up, read and weep:

    http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-4.html
    -5
  • Amdlova
    nothing to see here. another epic fail
    2
  • photonboy
    SLI 2xTitan X + GSYNC.

    If money was not an issue that's what I would do.

    *And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost then don't buy it.

    I don't complain because my FERRARI wasn't a good value.
    26
  • 10tacle
    I've bought Nvidia cards for the last 5 years, my last AMD card being a 5770. However, I have a bad taste in my mouth left with the 970 snafu (although currently I'm okay with the performance of mine at 1440p...for *now*...but I bought it for tomorrow too). Between that and this less-than-stellar result of this new Titan, I'm just not getting the warm and fuzzy feeling of confidence I used to with Nvidia. And who knows if THIS card's specs are truly correct, huh?? Depending on what AMD trots out with the new Radeon 3xx series, I just may be switching back to the Red Team as I make the move to 4K next year.
    6
  • Cash091
    Quote:
    Anonymous said:
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? It's effectively a Titan X vs. 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X. Although I heard NVIDIA's card will be the slower one there.
    Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).


    What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs. a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.


    It totally comes down to a performance-per-dollar thing. I'm shocked that with the 295X2 beating this in benches, they went with such a high price tag. $700 would have been a decent, yet high, price point for this card. I can see the appeal of this card, but the 295X2 outshines it. As the article states, the only people who want this are ones who don't have room to cool the 295X2 in their cases. What would be interesting to see is 2 of these vs. 2 295X2s (or 290X/295X2)!
    4
  • TechyInAZ
    Impressive at 4K resolutions. I thought this card would run for $1500; glad it's only $1000 (even though I can almost guarantee you I won't buy it, even if I had the money).

    I wish they had done at least 2-way SLI tests; that would have been fun.

    Any chance of trying three 4K monitors with a 3- or 4-way SLI Titan X config?

    I prefer the silver finish on the regular Titan and Titan Z also; pure black doesn't look that great. :)
    0