
GeForce GTX Titan X Review: Can One GPU Handle 4K?

How We Tested Nvidia’s GeForce GTX Titan X

We recently standardized our testing platform across Tom’s Hardware editors and offices, locking down consistent specifications for 2015. As such, all of today’s benchmarks are run on an Intel Core i7-5930K processor complemented by 16GB of DDR4-2400 memory. Stepping up to 500GB SSDs helps accommodate large benchmark suites, while our 850W power supply offers ample headroom to test even AMD’s Radeon R9 295X2.

Nvidia’s graphics cards are tested using driver build 347.84, while AMD’s are measured on the 14.12 Omega update posted in December of 2014. A few days before this review was scheduled to go live, AMD let us know it would introduce a FreeSync-enabled driver on March 19th, which also includes long-overdue CrossFire profiles for Evolve and Far Cry 4. After spot-testing our other benchmarks to confirm their results were unaffected, we re-tested Far Cry 4 using the upcoming build.

Thermal throttling was a more prominent issue back when AMD launched its Radeon R9 290X, but we continue to pre-heat graphics cards with warm-up benchmark runs before recording our actual results. This avoids capturing non-representative clock rates on both companies’ hardware.
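The pre-heat procedure can be sketched as a small wrapper, assuming a `run_pass` callable that launches one benchmark run (a hypothetical stand-in, not the actual THG harness):

```python
def run_warmed_benchmark(run_pass, warmup_passes=3):
    """Run the benchmark warmup_passes times so GPU clocks settle at their
    steady-state values, discard those results, then record one final pass."""
    for _ in range(warmup_passes):
        run_pass()      # warm-up passes: results are thrown away
    return run_pass()   # only this final pass is recorded
```

The point is simply that a cold card may boost to clock rates it cannot sustain, so only the post-warm-up pass is representative.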

Test System

CPU: Intel Core i7-5930K (Haswell-E), 3.5/3.7GHz, Six Cores, LGA 2011-v3, 15MB Shared L3 Cache, Hyper-Threading enabled
Motherboard: MSI X99S Xpower AC (LGA 2011-v3), Intel X99 Express, BIOS v1.6
Memory: Crucial Ballistix DDR4-2400, 4 x 4GB, 1200MHz, CL 16-16-16-39 2T
Graphics:
  • Nvidia GeForce GTX Titan X: 1002MHz GPU, 12GB GDDR5 at 1753MHz (7012 MT/s)
  • Nvidia GeForce GTX 980: 1126MHz GPU, 4GB GDDR5 at 1753MHz (7012 MT/s)
  • Nvidia GeForce GTX 780 Ti: 875MHz GPU, 3GB GDDR5 at 1753MHz (7012 MT/s)
  • Nvidia GeForce GTX Titan: 837MHz GPU, 6GB GDDR5 at 1502MHz (6008 MT/s)
  • AMD Radeon R9 295X2: 1018MHz GPU, 2 x 4GB GDDR5 at 1250MHz (5000 MT/s)
  • AMD Radeon R9 290X: 1000MHz GPU, 4GB GDDR5 at 1250MHz (5000 MT/s)
SSD: Crucial MX200, 500GB, SATA 6Gb/s
Power Supply: Be Quiet! Dark Power Pro 10, 850W, ATX12V, EPS12V
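As a side note on the memory figures above, peak bandwidth follows directly from the effective transfer rate and the memory bus width. The bus widths used below are not listed in the table; they are the commonly published figures for these cards (384-bit for the Titan X, 256-bit for the GTX 980, 512-bit for the R9 290X). A quick sketch:

```python
def bandwidth_gbs(mt_per_s, bus_bits):
    """Peak memory bandwidth in GB/s: transfers per second times
    bytes moved per transfer (bus width / 8)."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

# Bus widths assumed from public specifications, not from the table above.
cards = {
    "GeForce GTX Titan X": (7012, 384),
    "GeForce GTX 980": (7012, 256),
    "Radeon R9 290X": (5000, 512),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gbs(rate, bus):.1f} GB/s")
```

This is how the Titan X's roughly 336.6 GB/s figure is derived from the 7012 MT/s rate in the table.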

Software And Drivers

Operating System: Microsoft Windows 8 Pro x64
DirectX: DirectX 11
Graphics Drivers: All GeForce cards: Nvidia 347.25 Beta; All Radeon cards: AMD Catalyst Omega 14.12

Benchmarks

Middle-earth: Shadow of Mordor (Built-in benchmark, Ultra preset)
Battlefield 4 (Custom THG benchmark, Ultra preset)
Metro Last Light (Built-in benchmark, Very High preset, 16x AF, Normal motion blur)
Thief (Version 1.7, Built-in benchmark, Very High preset)
Tomb Raider (Version 1.01.748.0, Built-in benchmark, Ultimate preset)
Far Cry 4 (Version 1.9.0, Custom THG benchmark, 60-second Fraps run, Ultra preset)
  • Yuka
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!
    Reply
  • chiefpiggy
    The R9 295X2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, Nvidia. And Nvidia fanboys can just let the salt flow through their veins that a previous-generation card can beat their newest and most powerful card. Can't wait for the 3xx series to smash the Nvidia 9xx series.
    Reply
  • chiefpiggy
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    Reply
  • rolli59
    It would be interesting to see a comparison with cards like the 970 and R9 290 in dual-card setups, basically performance for the money.
    Reply
  • esrever
    Performance is pretty much expected from the leaked specs. Not bad performance but terrible price, as with all titans.
    Reply
  • dstarr3
    I don't know. I have a GTX770 right now, and I really don't think there's any reason to upgrade until we have cards that can average 60fps at 4K. And... that's unfortunately not this.
    Reply
  • hannibal
    Well, this is actually cheaper than I expected. Interesting card, and it would really benefit from less heat... throttling is really the limiting factor here.
    But yeah, this is expensive for its power, as Titans always have been, though it is not out of reach either. We need 14 to 16nm FinFET GPUs to make really good 4K graphics cards!
    Maybe next year...
    Reply
  • cst1992
    People keep comparing the dual-GPU 295X2 to the single-GPU Titan X. What about games where there is no CrossFire profile? There it's effectively a Titan X vs. 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X, although I hear Nvidia's card may be slower there.
    Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).
    Reply
  • skit75

    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual-GPU card from the previous generation; that sounds impressive on a purely performance measure. I am not a fan of SLI, but I suspect two of these would trounce anything around.

    Either way, the card is way out of my market, but now that another card has taken top honors, maybe it will bleed 970/980 prices down a little into my cheapskate hands.
    Reply
  • negevasaf
    IGN said the R9 390X (8.6 TFLOPS) is 38% more powerful than the Titan X (6.2 TFLOPS). Is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
    Reply