GeForce GTX Titan X Review: Can One GPU Handle 4K?

Results: Battlefield 4, Far Cry 4 And Metro Last Light

Battlefield 4

Naturally, we didn’t have time to test Hardline before the Titan X’s launch, so Battlefield 4 stands in once more for the Frostbite engine.

Every card on this chart can play Battlefield 4 smoothly at 2560x1440; QHD is hardly a challenge, even with the detail preset dialed to Ultra. Still, a 30% victory over the GeForce GTX 980 is impressive for Titan X, particularly since GM204 impressed us just a few months ago.

Radeon R9 295X2 sweeps in with an overall victory, though, delivering impressive performance for less money than any of Nvidia's cards. Notably absent is the GeForce GTX Titan Z, a board that was outperformed by AMD's dual-GPU solution.

The GeForce GTX 780 Ti, Titan, and 980 are all too slow for a smooth experience at 3840x2160, and AMD's Radeon R9 290X lags a bit as well under the same taxing Ultra preset.

Averaging 40 FPS (and dipping down to about 27), Nvidia’s GeForce GTX Titan X shows up on the wrong side of most enthusiasts’ 60 FPS target. However, after playing through a significant portion of the single-player campaign to test G-Sync, I can assure you that the game is enjoyably fluid.

Again, AMD's Radeon R9 295X2 delivers a 27%-better result at a lower cost. You'd simply have to be all right with its higher power requirements, bulky closed-loop water cooler, and dependence on CrossFire profiles to favor it in a gaming build.

Far Cry 4

On paper, the Radeon R9 295X2 enjoys a commanding 32% advantage over Nvidia’s GeForce GTX Titan X in Far Cry 4 at 2560x1440 using the Ultra quality preset. But a look at frame rate over time shows that card’s lead to be sporadic. At times, it’s actually slower than the more consistent GeForce GTX Titan X.

Meanwhile, the Titan X leads Nvidia’s own GeForce GTX 980 by 30%, surpassing the original Titan by an even more astounding 74%.

Even on high-end hardware at relatively modest settings, Far Cry 4 just doesn't run like a mature first-person shooter. The stereotypical reaction to this sort of control interface and jerky performance is, "ugh, console port." And despite average frame rates similar to Battlefield 4's, the Far Cry 4 experience doesn't feel as smooth.

That’s bad news at 3840x2160. The Radeon R9 295X2 does enjoy another technical win, but again starts the benchmark with frame rates under the Titan X, touching Radeon R9 290X territory. Perhaps this is related to the fact that AMD still hasn’t officially released a CrossFire profile for Far Cry 4; it’ll make its debut in the driver we’re testing in a couple of days.

Metro Last Light

The Radeon R9 295X2 posts a great benchmark run, beating GeForce GTX Titan X by 29%. Really though, all of the cards we’re testing demonstrate playable frame rates under Metro’s Very High detail setting. QHD isn’t supposed to be an obstacle for these high-end cards, though. Ultra HD is where the real glory lies.

The performance numbers don’t change much at 3840x2160 in Metro Last Light. The GeForce GTX 780 Ti, Titan, and 980 all average around 30 FPS, while Nvidia’s new Titan X comes closer to 40 FPS. The Radeon R9 290X fares admirably given its current price, but again shouldn’t be considered a 4K-capable card if you’re looking for maxed-out settings. AMD’s dual-GPU flagship gets us close to 60 FPS, beating the Titan X by 41%, at a price point below what Nvidia’s asking.

  • Yuka
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha.

    I'm impressed by their shenanigans. They up themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!
  • chiefpiggy
    The R9 295x2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, Nvidia. And Nvidia fanboys can just let the salt flow through their veins that a previous-generation card can beat their newest and most powerful card. Can't wait for the 3xx series to smash the Nvidia 9xx series
  • chiefpiggy
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha. I'm impressed by their shenanigans. They up themselves each time. In any case, at least this card looks fine for compute. Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
  • rolli59
    Would be interesting to compare with cards like the 970 and R9 290 in dual-card setups: basically performance for the money.
  • esrever
    Performance is pretty much expected from the leaked specs. Not bad performance but terrible price, as with all titans.
  • dstarr3
    I don't know. I have a GTX770 right now, and I really don't think there's any reason to upgrade until we have cards that can average 60fps at 4K. And... that's unfortunately not this.
  • hannibal
    Well, this is actually cheaper than I expected. Interesting card, and it would really benefit from less heat... The throttling is really the limiting factor here.
    But yeah, this is expensive for its power, as Titans always have been, but it's not out of reach either. We need 14 to 16nm FinFET GPUs to make really good 4K graphics cards!
    Maybe in the next year...
  • cst1992
    People go on comparing a dual GPU 295x2 to a single-GPU TitanX. What about games where there is no Crossfire profile? It's effectively a TitanX vs 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs the R9 390X. Although I heard NVIDIA's card will be slower then.
    Alternatively, we could go for 295X2 vs Titan X SLI or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled, and half the VRAM, kind of like the Titan and 780).
  • skit75
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha. I'm impressed by their shenanigans. They up themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD


    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around.

    Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.
  • negevasaf
    IGN said that the R9 390x (8.6 TF) is 38% more powerful than the Titan X (6.2 TF); is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
  • chiefpiggy
    1472755 said:
    People go on comparing a dual GPU 295x2 to a single-GPU TitanX. What about games where there is no Crossfire profile? It's effectively a TitanX vs 290X comparison. Personally, I think a fair comparison would be the GTX Titan X vs the R9 390X. Although I heard NVIDIA's card will be slower then. Alternatively, we could go for 295X2 vs TitanX SLI or 1080SLI(Assuming a 1080 is a Titan X with a few SMMs disabled, and half the VRAM, kind of like the Titan and 780).


    What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs a 295x2 when the SLI would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.
  • giovanni86
    I was hoping for far better results. Though priced at $1K it may seem worthwhile; I'll be waiting to see if EVGA releases something.
  • chiefpiggy
    192459 said:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha. I'm impressed by their shenanigans. They up themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Just because it's one GPU doesn't mean people should pay twice for less performance. If you can't see that, then I honestly do not understand... And their supposed ace in the hole is the 12GB of VRAM and G-Sync performance, but for a "4K" card I, for one, am not impressed
  • chiefpiggy
    1490340 said:
    IGN said that the R9 390x (8.6 TF) is 38% more powerful than the Titan X (6.2 TF), is that's true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked


    It's completely up to you whether or not to believe the rumors that come out over time, but I would usually just wait for the actual card to come out and then compare the benchmarks :p

    Unless of course we're talking about the GTX 970 scandal
  • backoffmanImascientist
    Quote:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha. I'm impressed by their shenanigans. They up themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Single (and Crossfired) 295X2 vs 2 GTX Titans in SLI coming right up, read and weep:

    http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-4.html
  • Amdlova
    nothing to see here. another epic fail
  • photonboy
    SLI 2xTitan X + GSYNC.

    If money was not an issue that's what I would do.

    *And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost then don't buy it.

    I don't complain because my FERRARI wasn't a good value.
  • 10tacle
    I've bought Nvidia cards for the last 5 years, my last AMD card being a 5770. However, the 970 snafu left a bad taste in my mouth (although currently I'm okay with the performance of mine at 1440p...for *now*...but I bought it for tomorrow too). Between that and this new Titan's less-than-stellar result, I'm just not getting the warm and fuzzy feeling of confidence I used to with Nvidia. And who knows if THIS card's specs are truly correct, huh?? Depending on what AMD trots out with the new Radeon 3xx series, I just may switch back to the Red Team as I make the move to 4K next year.
  • Cash091
    Quote:
    1472755 said:
    People go on comparing a dual GPU 295x2 to a single-GPU TitanX. What about games where there is no Crossfire profile? It's effectively a TitanX vs 290X comparison. Personally, I think a fair comparison would be the GTX Titan X vs the R9 390X. Although I heard NVIDIA's card will be slower then. Alternatively, we could go for 295X2 vs TitanX SLI or 1080SLI(Assuming a 1080 is a Titan X with a few SMMs disabled, and half the VRAM, kind of like the Titan and 780).
    What games dont have a crossfire profile? And why bother comparing a Titan X SLI vs a 295x2 when the SLI would cost almost 4x as much? Sure the performance would marginally be better (30-40% max), but at what cost? At a performance per dollar perspective the Titan X and Tian X SLI would be scraping the very bottom of the barrel.


    It totally comes down to a performance-per-dollar thing. I'm shocked that with the 295x2 beating this in benches, they went with such a high price tag. $700 would have been a decent, yet high, price point for this card. I can see the appeal of this card, but the 295x2 outshines it. As the article states, the only people who want this are ones who don't have room to cool the 295x2 in their cases. What would be interesting to see is two of these vs. two 295x2's (or 290x/295x2)!
  • TechyInAZ
    Impressive at 4k resolutions. I thought this card would run for $1500, glad it's only $1000 (even though I can almost guarantee you I won't buy it, even if I had the money).

    I wish they did at least 2-way SLI tests; that would have been fun.

    Any chance of trying three 4k monitors with a 3 or 4 way SLI titan X config?

    I prefer the silver finish on the regular titan and titan Z also, pure black doesn't look that great. :)
  • chiefpiggy
    67821 said:
    SLI 2xTitan X + GSYNC. If money was not an issue that's what I would do. *And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost then don't buy it. I don't complain because my FERRARI wasn't a good value.

    As companies strive for perfection, they should make their cards as appealing to all audiences as they can.

    In your case, you "own a Ferrari." When someone buys a Ferrari (or any car, house, etc.), it is assumed that they are in for a more permanent investment. New graphics cards, by nature, are released every year to year and a half, causing graphics cards to become outdated and obsolete within 3-5 years of release. When someone gets a Ferrari, it's assumed that the car will retain most, if not all, of its value after being purchased. Where graphics cards are concerned, they lose value relatively quickly.
  • nikoli707
    lol at some of you complaining about the cost of a professional card designed for double-precision floating point. It costs the same as the last Titan… you cannot find a Kepler Titan for under $500 even now.

    Otherwise it's a beast, like we knew it would be, even at a paltry 1000MHz? Show us the Classified at 1400MHz.
  • Blueberries
    Putting a water loop on a Titan X could potentially be a real workstation option, especially if you're willing to wipe your vbios.
  • skit75
    1912439 said:
    Quote:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So to trade off the lackluster performance over the GTX980, they wanted to cover it up with a "smooth experience", huh? hahaha. I'm impressed by their shenanigans. They up themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.
    Single (and Crossfired) 295X2 vs 2 GTX Titans in SLI coming right up, read and weep: http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-4.html


    ROFL! Weeping from laughter.

    Beatrice... is that you? "That is not how this works... that's not how any of this works!"