GeForce GTX Titan X Review: Can One GPU Handle 4K?

GeForce GTX Titan X And G-Sync At 4K

I co-authored our first look at G-Sync alongside Filippo Scognamiglio in G-Sync Technology Preview: Quite Literally A Game Changer. In that piece, we discussed the issues with conventional v-sync, introduced Nvidia’s approach to variable refresh and shared our experiences with the first G-Sync-capable monitor. During the initial barrage of questions Filippo and I lobbed at Nvidia, we debated the technology’s value in 120 and 144Hz monitors. G-Sync would shine brightest, we determined, between 30 and 60 FPS, where you might want v-sync turned on to mitigate tearing, but then be subject to stuttering as the output shifted between 60 and 30Hz.  
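
To make that 60-to-30Hz oscillation concrete, here is a minimal Python sketch. The per-frame render times are hypothetical and the double-buffered v-sync model is simplified, but it shows how a fixed 60Hz panel rounds every on-screen frame time up to a multiple of its 16.7ms scan, while variable refresh simply tracks the render time:

    import math

    REFRESH_HZ = 60
    REFRESH_MS = 1000.0 / REFRESH_HZ  # one scan of a fixed 60Hz panel: ~16.7 ms

    # Hypothetical render times (ms) hovering around 40 FPS, the band where
    # a single high-end card tends to land at 3840x2160 with maxed settings.
    render_ms = [22.0, 26.0, 24.0, 31.0, 19.0, 27.0, 25.0, 30.0]

    # Double-buffered v-sync: a finished frame waits for the next refresh, so
    # its on-screen time is the render time rounded up to a whole scan.
    vsync_ms = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]

    # Variable refresh (G-Sync): the panel scans out when the frame is ready,
    # so on-screen time tracks render time (within the panel's refresh range).
    gsync_ms = render_ms

    for label, times in (("v-sync", vsync_ms), ("G-Sync", gsync_ms)):
        avg_fps = 1000.0 * len(times) / sum(times)
        print(f"{label}: {avg_fps:.1f} FPS average")  # v-sync: 30.0, G-Sync: 39.2

Every render time in that 30-to-60 FPS band rounds up to two scans, which is why v-sync pins the output at 30 FPS even when the GPU could be delivering nearly 40.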

Incidentally, that’s the range most of our GeForce GTX Titan X benchmarks fell into at 3840x2160 with detail settings as high as they’d go. Could G-Sync be the technology needed for this specific card at that exact resolution?

Nvidia sent over Acer’s XB280HK bprz, currently the only G-Sync-enabled 4K screen you can buy. The 28” display sells for $750 at Newegg, making it a reasonable (or even affordable) pairing for the equally niche GeForce GTX Titan X. Christian Eberle will handle our review of the XB280HK, but I do want to mention that this particular sample showed up with a dead pixel. Then again, so did our Asus PQ321, which sold for more than four times as much.

G-Sync And 4K, In Practice

If you think back to our previously linked preview, we clarified that G-Sync is a quality feature; it doesn’t affect performance. So, the benchmark results you just saw persist through this experiment, since we’re leaving v-sync off and enabling G-Sync.

Far Cry 4 is the first title I wanted to look at. Its lush outdoor environment makes tearing especially obvious with v-sync off (the trees are particularly susceptible). Our performance data shows frame rates between 33 and 51 FPS, averaging just under 40. Were you to turn v-sync on, you’d be locked to 30 FPS and subject to some degree of input latency. With v-sync off, the tearing between frames is unmistakable. G-Sync alleviates the tearing while giving you back the performance above 30 FPS that v-sync would otherwise sacrifice. Awesome. But while the technology sounds like a magic bullet, you’re still looking at dips down to 33 FPS. G-Sync doesn’t add or interpolate frames; it simply improves perceived smoothness. Because of the way it plays, Far Cry 4 would benefit from a less demanding detail level or some additional rendering horsepower.

This is what tearing looks like in a tree-filled environment (Crysis 3)
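
As an aside on why the tearing stands out so much: with v-sync off, the buffer swap happens whenever a frame finishes, which on a fixed-refresh panel usually means partway through a scan. A rough sketch (the swap timing below is hypothetical) of where the tear line lands:

    # Estimate where a tear line appears when a buffer swap happens mid-scan
    # on a fixed-refresh panel with v-sync off.
    REFRESH_MS = 1000.0 / 60  # one top-to-bottom scan takes ~16.7 ms
    LINES = 2160              # vertical resolution at 3840x2160

    def tear_line(swap_time_ms: float) -> int:
        """Scanline at which the new frame replaces the old one."""
        phase = (swap_time_ms % REFRESH_MS) / REFRESH_MS  # fraction of scan done
        return int(phase * LINES)

    print(tear_line(25.0))  # ~1080: the tear lands about halfway down the panel

Every high-contrast vertical edge, like those tree trunks, gets offset at that scanline, which is why foliage-heavy scenes make tearing so easy to spot. G-Sync sidesteps the artifact entirely by never swapping mid-scan.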

How about Battlefield 4? We reported between 27 and 48 FPS through our benchmark, averaging a similar 39 frames per second on one GeForce GTX Titan X at 3840x2160. Even though this title is also fairly fast-paced, the action feels much smoother with G-Sync enabled than the frame rate would suggest.

The same goes for Metro Last Light, one of the games I was most excited to revisit when we first applied G-Sync back in 2013. The technology irons out the severe tearing you’d normally see strafing through narrow halls, even at an average of 39 FPS. Middle-earth, Thief, Tomb Raider, Crysis 3: they all benefit more from G-Sync’s uncapped frame rates than they would from turning v-sync on, while avoiding the tearing you’d otherwise see with v-sync off.

Whether or not you consider the resulting experience enjoyable, though, is a matter of personal opinion. My own take is this: GeForce GTX Titan X is the first single-GPU card capable of playable numbers in a majority of games at 3840x2160 and maxed-out detail settings. Adding G-Sync neutralizes the artifacts that crop up as you choose between enabling or disabling v-sync in Titan X’s performance range. Really, the technology couldn’t have become available in a 4K panel at a better time.

361 comments
  • Yuka
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha.

    I'm impressed by their shenanigans. They outdo themselves each time.

    In any case, at least this card looks fine for compute.

    Cheers!
  • chiefpiggy
    The R9 295X2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, Nvidia. And Nvidia fanboys can just let the salt flow through their veins knowing that a previous-generation card can beat the newest and most powerful card. Can't wait for the 3xx series to smash the Nvidia 9xx series.
  • chiefpiggy
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha. I'm impressed by their shenanigans. They outdo themselves each time. In any case, at least this card looks fine for compute. Cheers!

    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
  • rolli59
    Would be interesting to compare with cards like the 970 and R9 290 in dual-card setups; basically, performance for the money.
  • esrever
    Performance is pretty much as expected from the leaked specs. Not bad performance, but a terrible price, as with all Titans.
  • dstarr3
    I don't know. I have a GTX 770 right now, and I really don't think there's any reason to upgrade until we have cards that can average 60 FPS at 4K. And... that's unfortunately not this.
  • hannibal
    Well, this is actually cheaper than I expected. Interesting card, and it would really benefit from less heat... Throttling is really the limiting factor here.
    But yeah, this is expensive for its power, as Titans always have been, but it is not out of reach either. We need 14 to 16nm FinFET GPUs to make really good 4K graphics cards!
    Maybe next year...
  • cst1992
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? There it's effectively a Titan X vs. 290X comparison.
    Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X, although I heard Nvidia's card will be the slower one.
    Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).
  • skit75
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha. I'm impressed by their shenanigans. They outdo themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD


    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around.

    Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.
  • negevasaf
    IGN said that the R9 390X (8.6 TFLOPS) is 38% more powerful than the Titan X (6.2 TFLOPS). Is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
  • chiefpiggy
    1472755 said:
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? There it's effectively a Titan X vs. 290X comparison. Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X, although I heard Nvidia's card will be the slower one. Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).


    What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs. a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.
  • giovanni86
    I was hoping for far better results. Though the $1K price may seem worthwhile, I will be waiting to see if EVGA releases something.
  • chiefpiggy
    192459 said:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha. I'm impressed by their shenanigans. They outdo themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Just because it's one GPU doesn't mean people should pay twice as much for less performance. If you can't see that, then I honestly do not understand... And their supposed ace in the hole is the 12GB of VRAM and G-Sync performance, but for a "4K" card, I for one am not impressed.
  • chiefpiggy
    1490340 said:
    IGN said that the R9 390x (8.6 TF) is 38% more powerful than the Titan X (6.2 TF), is that's true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked


    It's completely up to you whether or not to believe the rumors that come out over time, but I would usually just wait for the actual card to come out and then compare the benchmarks :p

    Unless of course we're talking about the GTX 970 scandal
  • backoffmanImascientist
    Quote:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha. I'm impressed by their shenanigans. They outdo themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.


    Single (and Crossfired) 295X2 vs 2 GTX Titans in SLI coming right up, read and weep:

    http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-4.html
  • Amdlova
    Nothing to see here. Another epic fail.
  • photonboy
    SLI 2xTitan X + GSYNC.

    If money was not an issue that's what I would do.

    *And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost then don't buy it.

    I don't complain because my FERRARI wasn't a good value.
  • 10tacle
    I've bought Nvidia cards for the last 5 years, my last AMD card being a 5770. However, I have a bad taste left in my mouth by the 970 snafu (although currently I'm okay with the performance of mine at 1440p... for *now*... but I bought it for tomorrow too). Between that and the less-than-stellar result of this new Titan, I'm just not getting the warm and fuzzy feeling of confidence I used to with Nvidia. And who knows if THIS card's specs are truly correct, huh?? Depending on what AMD trots out with the new Radeon 3xx series, I just may be switching back to the Red Team as I make the move to 4K next year.
  • Cash091
    Quote:
    1472755 said:
    People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? There it's effectively a Titan X vs. 290X comparison. Personally, I think a fair comparison would be the GTX Titan X vs. the R9 390X, although I heard Nvidia's card will be the slower one. Alternatively, we could go for 295X2 vs. Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and 780).
    What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs. a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.


    It totally comes down to performance per dollar. I'm shocked that, with the 295X2 beating this in benches, they went with such a high price tag. $700 would have been a decent, if still high, price point for this card. I can see the appeal of this card, but the 295X2 outshines it. As the article states, the only people who want this are the ones who don't have room to cool a 295X2 in their cases. What would be interesting to see is two of these vs. two 295X2s (or 290X/295X2)!
  • TechyInAZ
    Impressive at 4K resolutions. I thought this card would run for $1500; glad it's only $1000 (even though I can almost guarantee you I won't buy it, even if I had the money).

    I wish they had done at least 2-way SLI tests; that would have been fun.

    Any chance of trying three 4K monitors with a 3- or 4-way SLI Titan X config?

    I prefer the silver finish on the regular Titan and Titan Z also; pure black doesn't look that great. :)
  • chiefpiggy
    67821 said:
    SLI 2xTitan X + GSYNC. If money was not an issue that's what I would do. *And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost then don't buy it. I don't complain because my FERRARI wasn't a good value.

    As companies strive for perfection, they should make their cards as appealing to as many audiences as they can.

    In your case, you "own a Ferrari." When someone buys a Ferrari (or any car, house, etc.), it is assumed that they are in it for a more permanent investment. New graphics cards, by nature, are released every year to year and a half, causing existing cards to become outdated and obsolete within 3-5 years of release. When someone gets a Ferrari, it's assumed that the car will retain most, if not all, of its value after being purchased. Where graphics cards are concerned, they lose value relatively quickly.
  • nikoli707
    LOL at some of you complaining about the cost of a professional card designed for double-precision floating point. It costs the same as the last Titan… you cannot find a Kepler Titan for under $500 even now.

    Otherwise it's a beast, like we knew it would be, even at a paltry 1000MHz. Show us the Classified at 1400MHz.
  • Blueberries
    Putting a water loop on a Titan X could potentially be a real workstation option, especially if you're willing to wipe your vbios.
  • skit75
    1912439 said:
    Quote:
    Quote:
    Quote:
    Interesting move by nVidia to send a G-Sync monitor... So, to offset the lackluster performance over the GTX 980, they wanted to cover it up with a "smooth experience", huh? Hahaha. I'm impressed by their shenanigans. They outdo themselves each time. In any case, at least this card looks fine for compute. Cheers!
    Paying almost double for a 30% increase in performance??? Shenanigans alright xD
    You're surprised? Early adopters always pay the premium. I find it interesting you mention "almost every benchmark" when comparing this GPU to a dual GPU of last generation. Sounds impressive on a purely performance measure. I am not a fan of SLI but I suspect two of these would trounce anything around. Either way the card is way out of my market but now that another card has taken top honors, maybe it will bleed the 970/980 prices down a little into my cheapskate hands.
    Single (and Crossfired) 295X2 vs 2 GTX Titans in SLI coming right up, read and weep: http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-4.html


    ROFL! Weeping from laughter.

    Beatrice... is that you? "That is not how this works... that's not how any of this works!"