Nvidia GeForce GTX 780 Review: Titan’s Baby Brother Is Born

GeForce Experience And ShadowPlay

GeForce Experience

As PC enthusiasts, we appreciate the permutations of quality- and performance-oriented settings available to us in the latest games. It’s validating to spend a bunch of money on a new graphics card and then crank the graphics details up to the highest levels. When an option pushes our hardware too far and the game remains unplayable, we’re irked, knowing that the title could look that much better.

Dialing in the optimal settings isn’t easy, though. Some yield better visuals than others, while performance is impacted to varying degrees. GeForce Experience is Nvidia’s attempt to simplify choosing game settings by comparing your CPU, GPU, and resolution to a database of configurations. The other half of the utility makes sure your drivers stay up to date.
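The matching step Nvidia describes amounts to a database lookup keyed on your hardware and display. This Python sketch is purely illustrative; the table entries and the `recommend` function are hypothetical stand-ins, not Nvidia's actual data or API:

```python
# Hypothetical sketch of an "optimal settings" lookup like the one
# GeForce Experience performs: match (GPU, resolution) against a
# database of tested configurations. All entries are illustrative.
SETTINGS_DB = {
    ("GeForce GTX 780", "1920x1080"): {"quality": "Ultra", "msaa": "4x", "fxaa": "Off"},
    ("GeForce GTX 660", "1920x1080"): {"quality": "High", "msaa": "2x", "fxaa": "On"},
}

def recommend(gpu, resolution):
    # Fall back to conservative defaults when the combination is unknown
    return SETTINGS_DB.get(
        (gpu, resolution),
        {"quality": "Medium", "msaa": "Off", "fxaa": "On"},
    )

print(recommend("GeForce GTX 780", "1920x1080")["quality"])
```

The interesting engineering is in populating such a table across thousands of CPU/GPU/resolution combinations, which is presumably where Nvidia's internal benchmarking effort goes.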

I still think that enthusiasts are going to resist running an additional piece of software to make these decisions for them. But more mainstream gamers inclined to install a title and fire it up immediately, without checking for drivers or tweaking its settings, naturally stand to gain quite a bit. Insofar as GeForce Experience helps those folks enjoy a more compelling experience, Nvidia’s software is good for PC gaming.

GeForce Experience detected all nine of the games installed on our test bed. Of course, none of them were still at their default settings, since they were set up for benchmarking. It’d be interesting, I thought, to see how Experience would change the options I dialed in myself.

For Tomb Raider, GeForce Experience wanted to disable TressFX, even though the GeForce GTX 780 averages in the mid-40 FPS range with it on. The utility wasn’t able to detect our Far Cry 3 configuration at all for some reason, though the settings it suggested were sufficiently high-end. In Skyrim, it wanted to turn off FXAA for some strange reason.

It’s very cool to get a set of screenshots for each game with descriptive boxes explaining what effect each setting has on image quality. In the nine examples I looked over, GeForce Experience did come close to the settings I would have selected. But it also biases toward Nvidia-specific features like PhysX (which it cranks up to High in Borderlands 2) and against those introduced by AMD (including TressFX in Tomb Raider). Disabling FXAA in a game like Skyrim just doesn’t make sense given near-100 FPS average frame rates. GeForce Experience will likely become more important for enthusiasts to have installed once Shield starts shipping; the Game Streaming feature appears to be enabled through Nvidia’s app.

Update, 5/28: Nvidia sent over some feedback indicating that TressFX is disabled by GeForce Experience at the configuration and resolution we selected in Tomb Raider because previous versions of the game achieved significantly lower performance. So, it seems there will be some delay between when drivers or patches dramatically improve frame rates and when GeForce Experience takes those improvements into consideration. For now, I maintain that this is a mismatch between the settings an enthusiast would use and what the automated system selects.

In Skyrim, however, Nvidia clarifies that FXAA is not enabled by default because this feature actually hurts image quality when used in conjunction with MSAA. Thus, the decision to keep FXAA turned off was deliberate, not an oversight.

Finally, the company is adamant that it does not weigh its own technologies more heavily than the competition's. To illustrate, it points out that PhysX is disabled in Planetside 2, favoring other detail settings that contribute more to the experience. Specifically, "PhysX effects can improve the game play experience immensely (as seen in a game like Batman: Arkham City), but PhysX or TXAA are never given higher priority in GFE just because they’re Nvidia technologies." We appreciate Nvidia's feedback, which of course is the product of testing many more game titles than we have available here. As GeForce Experience matures, we look forward to seeing how gamers at all levels react to it.

ShadowPlay: An Always-On DVR For Gaming

A long time ago, in a galaxy far away, I was pretty into WoW. Like, I raided four days a week, multiple hours a day. Any time our guild tackled a new boss, we’d record every attempt in the hopes we’d score a kill. When we did, one of us would post a video of it. Invariably, the people recording the video needed a decently high-end machine, Fraps, and plenty of storage space for all of those hours of wipes. 

I don’t do that anymore. But when Nvidia talked to me about its new ShadowPlay feature, raiding is what instantly came to mind.

Enabled, ShadowPlay leverages the NVEnc fixed-function encoder built into Kepler-based GPUs and automatically records the last 20 minutes of game play. Or, you can manually control when ShadowPlay starts and stops doing its thing. This replaces software-based solutions like Fraps, which exact a more demanding load on your host processor.
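Conceptually, the "last 20 minutes" behavior is a ring buffer of hardware-encoded frames: new frames push the oldest ones out, so memory use stays fixed. Here is a minimal Python sketch of the idea; the class and names are hypothetical illustrations, not Nvidia's implementation:

```python
from collections import deque

# Illustrative defaults matching the feature as described:
# 30 FPS, last 20 minutes retained.
FPS = 30
SECONDS_KEPT = 20 * 60

class GameplayDVR:
    """Fixed-size ring buffer of encoded frames (hypothetical sketch)."""

    def __init__(self, fps=FPS, seconds=SECONDS_KEPT):
        # deque with maxlen discards the oldest frame automatically
        self.buffer = deque(maxlen=fps * seconds)

    def on_frame(self, encoded_frame):
        # Called once per frame coming out of the hardware encoder
        self.buffer.append(encoded_frame)

    def save_clip(self):
        # Snapshot the retained window, e.g. to mux into a video file
        return list(self.buffer)

# Tiny sizes for demonstration: keep 2 FPS * 3 s = 6 frames
dvr = GameplayDVR(fps=2, seconds=3)
for i in range(10):
    dvr.on_frame(f"frame{i}")
clip = dvr.save_clip()  # only the newest 6 frames remain
```

The key point is that the expensive part, H.264 encoding, happens on the GPU's NVEnc block, so maintaining this buffer costs the CPU almost nothing compared to a Fraps-style software capture.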

As a quick refresher from our GeForce GTX 680 launch story, NVEnc is limited to H.264 encodes at up to 4096x4096. ShadowPlay is not yet available, but Nvidia says that when it shows up later this summer, it’ll do 1080p recordings at up to 30 FPS. I’d like to see it capable of higher resolutions, given earlier claims that the encoder can handle them in hardware.

  • CrisisCauser
    A good alternative to the Titan. $650 was the original GTX 280 price before AMD came knocking with the Radeon 4870. I wonder if AMD has another surprise in store.
  • gigantor21
    GG Titan.
  • It's definitely a more reasonable priced alternative to the titan, but it's still lacking in compute. Which might disappoint some but I don't think it'll bother most people. Definitely not bad bang for buck at that price range considering how performance scales with higher priced products, but it could've been better, $550-$600 seems like a more reasonable price for this.
  • hero1
    This is what I have been waiting for. Nice review and I like the multi gpu tests. Thanks. Time to search the stores. Woohoo!!
  • natoco
    Too much wasted silicon (just a failed high-spec chip made last year, even the Titan) rebadged with all the failed sections turned off. I wanted to upgrade my GTX 480 to a 780, but for the die size, the performance is too low, unfortunately. It has certainly not hit the trifecta like the 680 did. Would you buy a V8 with 2 cylinders turned off even if it were cheaper? No, because it would not be as smooth as it was engineered to be, so by that analogy, no deal. Customer lost until next year, when they release a chip to the public that's all switched on. I will never go down the turned-off-parts route again.
  • EzioAs
    In my opinion, this card and the Titan are actually a clever product release by Nvidia. Much like the GTX 680 and GTX 670, the Titan was released at a higher price (like the GTX 680), while the slightly slower GTX 780 (the GTX 670 of this generation) comes in at a significantly lower price while performing quite close to its higher-end brother. We all remember that when the GTX 670 launched, it made the GTX 680 look bad, because the GTX 670 was 80% of the price while maintaining around 90-95% of the performance.

    Of course, one could argue that as we approach the higher-end products, the performance increase is always minimal and the price-to-performance ratio worsens. However, for the past 3-4 years (or so I guess), never has the second-highest-end GPU had such a small performance difference from the highest-end GPU. It's usually significant enough that the highest-end GPU (GTX x80) still has its place.

    Tl;dr:

    The GTX Titan was released to make the GTX 780 look incredibly good, and people (especially on the internet) will spread the news fast enough, claiming the $650 release price for the GTX 780 is good and reasonable. People who didn't even bother reading reviews and benchmarks will take their word for it and pay the premium for the GTX 780.

    Nvidia is taking a different route to compete with AMD, or one could say they're not even trying to compete with AMD in terms of price/performance (at least for the high-end products).
  • mouse24
    natoco: "Too much wasted silicon ... Would you buy a V8 with 2 cylinders turned off even if it were cheaper?"
    That's a pretty bad analogy. A GPU is still smooth even with some of the cores/VRAM/etc. turned off; it doesn't increase latency, frame times, and so on.
  • godfather666
    "But, I’m going to wait a week before deciding what I’d spend my money on in the high-end graphics market. "

    I must've missed something. Why wait a week?
  • JamesSneed
    Natoco, your comment was so clueless. It is likely that every single CPU or GPU you have ever purchased has fused-off parts. Even the $1000 Extreme Edition Intel CPU has a little bit fused off, since it's a 6-core CPU that uses an 8-core Xeon die as its starting point. Your comparison to a car is idiotic.
  • 016ive
    You'd have to be an idiot to buy a Titan now that the 780 is here... Me, I could afford neither :)