GeForce GTX Titan X And G-Sync At 4K
I co-authored our first look at G-Sync alongside Filippo Scognamiglio in G-Sync Technology Preview: Quite Literally A Game Changer. In that piece, we discussed the issues with conventional v-sync, introduced Nvidia's approach to variable refresh and shared our experiences with the first G-Sync-capable monitor. During the initial barrage of questions Filippo and I lobbed at Nvidia, we debated the technology's value in 120Hz and 144Hz monitors. G-Sync would shine brightest, we determined, between 30 and 60 FPS, where you'd want v-sync turned on to mitigate tearing, but would then be subject to stuttering as the output shifted between 60 and 30Hz.
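That 60-to-30Hz shifting happens because, with v-sync on, a fixed-refresh panel holds each frame until the next refresh boundary, rounding every frame time up to a multiple of the scan interval. Here's a minimal sketch of the idea, using hypothetical render times (not our measured data) on a 60Hz panel:

```python
import math

REFRESH_MS = 1000 / 60  # one scan interval on a 60 Hz panel (~16.7 ms)

def vsync_display_time(render_ms):
    """With v-sync on, a finished frame waits for the next refresh
    boundary, so its on-screen time rounds UP to a multiple of ~16.7 ms."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_time(render_ms):
    """With variable refresh (G-Sync), the panel scans out when the frame
    is ready (within its supported range), so on-screen time tracks
    render time directly."""
    return render_ms

# Render times straddling the 16.7 ms boundary: v-sync snaps the
# effective rate between 60 and 30 Hz, G-Sync does not.
for render_ms in (15.0, 20.0, 25.0):
    v = vsync_display_time(render_ms)
    g = gsync_display_time(render_ms)
    print(f"render {render_ms:.1f} ms -> v-sync: {v:.1f} ms ({1000/v:.0f} Hz), "
          f"G-Sync: {g:.1f} ms ({1000/g:.0f} Hz)")
```

A frame rendered in 20ms and a frame rendered in 25ms both end up displayed for 33.3ms under v-sync, which is exactly the juddering 60/30Hz alternation described above.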
Incidentally, that’s the range most of our GeForce GTX Titan X benchmarks fell into at 3840x2160 with detail settings as high as they’d go. Could G-Sync be the technology needed for this specific card at that exact resolution?
Nvidia sent over Acer's XB280HK bprz, currently the only G-Sync-enabled 4K screen you can buy. The 28" display is on sale for $750 over at Newegg, making it a reasonable (or even affordable) pairing for the equally niche GeForce GTX Titan X. Christian Eberle will handle our review of the XB280HK. But I do want to mention that this particular sample showed up with a dead pixel. Then again, so did our Asus PQ321, which sold for more than four times as much.
G-Sync And 4K, In Practice
If you remember back to our previously-linked preview, we clarified that G-Sync is a quality feature; it doesn’t affect performance. So, the benchmark results you just saw persist through this experiment since we’re leaving v-sync off and enabling G-Sync.
Far Cry 4 is the first title I wanted to look at. Its lush outdoor environment makes tearing especially obvious with v-sync off (the trees are particularly susceptible). Our performance data shows frame rates between 33 and 51 FPS, averaging just under 40 FPS. Were you to turn v-sync on, you'd see 30 FPS and experience some degree of latency. With v-sync off, the tearing between frames is unmistakable. G-Sync alleviates the tearing while giving you back the performance above 30 FPS that v-sync would otherwise sacrifice in the 30-to-60 FPS range. Awesome. But while the technology sounds like a magic bullet, you're still looking at dips down to 33 FPS. G-Sync doesn't add or interpolate frames; it simply improves perceived smoothness. Because of the way it plays, Far Cry 4 would benefit from a less demanding detail level or some additional rendering horsepower.
This is what tearing looks like in a tree-filled environment (Crysis 3)
How about Battlefield 4? We reported between 27 and 48 FPS through our benchmark, averaging a similar 39 frames per second on one GeForce GTX Titan X at 3840x2160. Even though this title is also fairly fast-paced, the action feels much smoother with G-Sync enabled than our frame rate numbers suggest.
The same goes for Metro Last Light, one of the games I was most excited about when we applied G-Sync to it back in 2013. The technology irons out the severe tearing you'd normally see strafing through narrow halls, even at an average of 39 FPS. Middle-earth, Thief, Tomb Raider, Crysis 3: they all benefit more from the higher frame rates of leaving v-sync off than from turning it on, while G-Sync eliminates the tearing you'd otherwise see once v-sync is disabled.
Whether you consider the resulting experience enjoyable, though, is a matter of personal opinion. My own take is this: GeForce GTX Titan X is the first single-GPU card capable of playable frame rates in a majority of games at 3840x2160 with maxed-out detail settings. Adding G-Sync neutralizes the artifacts that crop up as you choose between enabling and disabling v-sync in Titan X's performance range. Really, the technology couldn't have become available in a 4K panel at a better time.