GeForce Experience
As PC enthusiasts, we appreciate the permutations of quality- and performance-oriented settings available to us in the latest games. It’s validating to spend a bunch of money on a new graphics card and then crank the graphics details up to the highest levels. And when an option pushes our hardware too far, leaving the game unplayable, we’re irked knowing the title could look that much better.
Dialing in the optimal settings isn’t easy, though. Some yield better visuals than others, while performance is impacted to varying degrees. GeForce Experience is Nvidia’s attempt to simplify choosing game settings by comparing your CPU, GPU, and resolution to a database of configurations. The other half of the utility makes sure your drivers stay up to date.
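Conceptually, that matching step is a database lookup keyed on your hardware and display. Here is a minimal, hypothetical sketch of the idea in Python; the table entries, presets, and function name are invented for illustration and are not Nvidia's actual data or API.

```python
# Hypothetical sketch of the settings-matching idea: look up a tested
# (GPU, resolution) combination in a database of recommended presets.
# The table contents and names here are invented for illustration.
RECOMMENDED = {
    ("GeForce GTX 780", "1920x1080"): {"quality": "Ultra", "aa": "4x MSAA"},
    ("GeForce GTX 780", "2560x1600"): {"quality": "High", "aa": "FXAA"},
}

def optimal_settings(gpu: str, resolution: str) -> dict:
    """Return the closest known-good preset, falling back to safe defaults."""
    return RECOMMENDED.get((gpu, resolution), {"quality": "Medium", "aa": "Off"})

print(optimal_settings("GeForce GTX 780", "1920x1080"))
# prints {'quality': 'Ultra', 'aa': '4x MSAA'}
```

The real utility presumably weighs far more variables (CPU, driver version, per-game profiles), but the fallback-to-safe-defaults pattern is why unknown configurations, like our Far Cry 3 setup, can still get reasonable suggestions.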
I still think that enthusiasts are going to resist running an additional piece of software to make these decisions for them. But more mainstream gamers inclined to install a title and fire it up immediately, without checking for drivers or tweaking its settings, naturally stand to gain quite a bit. Insofar as GeForce Experience helps those folks enjoy a more compelling experience, Nvidia’s software is good for PC gaming.
GeForce Experience detected all nine of the games installed on our test bed. Of course, none of them were still at their default settings, since they were set up for benchmarking. It’d be interesting, I thought, to see how Experience would change the options I dialed in myself.
For Tomb Raider, GeForce Experience wanted to disable TressFX, even though the GeForce GTX 780 averages in the mid-40 FPS range with it on. The utility wasn’t able to detect our Far Cry 3 configuration at all for some reason, though the settings it suggested were sufficiently high-end. In Skyrim, it wanted to turn off FXAA for some strange reason.
It’s very cool to get a set of screenshots for each game with descriptive boxes explaining what effect each setting has on image quality. In the nine examples I looked over, GeForce Experience did come close to the settings I would have selected. But it is also biased toward Nvidia-specific features like PhysX (which it cranks up to High in Borderlands 2) and against those introduced by AMD (including TressFX in Tomb Raider). Disabling FXAA in a game like Skyrim just doesn’t make sense given near-100 FPS average frame rates. It’ll likely be more important for enthusiasts to have GeForce Experience installed once Shield starts shipping; the Game Streaming feature appears to be enabled through Nvidia’s app.
Update, 5/28: Nvidia sent over some feedback indicating that TressFX is disabled by GeForce Experience using the configuration and resolution we selected in Tomb Raider because previous versions of the game achieved a significantly lower performance level. So, it seems there will be some delay between when drivers or patches dramatically improve frame rates, and when GeForce Experience takes those improvements into consideration. For now, I maintain that this is a mismatch in the settings an enthusiast would use compared to the automated system. In Skyrim, however, Nvidia clarifies that FXAA is not enabled by default because this feature is actually bad for image quality when used in conjunction with MSAA. Thus, the decision to keep FXAA turned off was deliberate, and not an oversight. Finally, the company is adamant it does not weigh its own technologies higher than the competition's. To illustrate, it points out that PhysX is disabled in Planetside 2, favoring other detail settings that contribute more to the experience. Specifically, "PhysX effects can improve the game play experience immensely (as seen in a game like Batman: Arkham City), but PhysX or TXAA are never given higher priority in GFE just because they’re Nvidia technologies." We appreciate Nvidia's feedback, which of course is the product of many more game titles than we have available here for testing. As GeForce Experience matures, we look forward to seeing how gamers at all levels react to it.
ShadowPlay: An Always-On DVR For Gaming
A long time ago, in a galaxy far away, I was pretty into WoW. Like, I raided four days a week, multiple hours a day. Any time our guild tackled a new boss, we’d record every attempt in the hopes we’d score a kill. When we did, one of us would post a video of it. Invariably, the people recording the video needed a decently high-end machine, Fraps, and plenty of storage space for all of those hours of wipes.
I don’t do that anymore. But when Nvidia talked to me about its new ShadowPlay feature, raiding is what instantly came to mind.

When it’s enabled, ShadowPlay leverages the NVEnc fixed-function encoder built into Kepler-based GPUs to automatically record the last 20 minutes of game play. Or, you can manually control when ShadowPlay starts and stops doing its thing. This replaces software-based solutions like Fraps, which exact a more demanding load on your host processor.
As a quick refresher from our GeForce GTX 680 launch story, NVEnc is limited to H.264 encodes at up to 4096x4096. ShadowPlay is not yet available, but Nvidia says that when it shows up later this summer, it’ll do 1080p recordings at up to 30 FPS. I’d like to see it capable of higher resolutions, given earlier claims that the encoder can handle them in hardware.
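The "last 20 minutes" behavior is essentially a rolling buffer: new frames keep coming in, and once the window is full, the oldest ones are silently discarded. Here is a minimal Python sketch of that idea under stated assumptions (30 FPS, a 20-minute window); the names are invented, and the real feature of course operates on hardware-encoded H.264, not Python objects.

```python
from collections import deque

FPS = 30                  # ShadowPlay records at up to 30 FPS
WINDOW_SECONDS = 20 * 60  # keep the last 20 minutes

# A deque with maxlen silently drops the oldest frame once it is full,
# so memory stays bounded no matter how long the session runs.
buffer = deque(maxlen=FPS * WINDOW_SECONDS)

def capture_frame(encoded_frame: bytes) -> None:
    """Append one encoded frame; the oldest frames fall off the back."""
    buffer.append(encoded_frame)

def save_clip() -> list:
    """Snapshot the rolling window, e.g. when the player hits 'save'."""
    return list(buffer)

# Simulate an hour of game play: only the final 20 minutes survive.
for i in range(FPS * 60 * 60):
    capture_frame(f"frame-{i}".encode())

clip = save_clip()
assert len(clip) == FPS * WINDOW_SECONDS
```

The point of the fixed-function encoder is that the GPU produces these compressed frames for free, so keeping a bounded window like this costs almost nothing on the host CPU, unlike Fraps-style capture.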



Of course, one could argue that as we get closer to higher-end products, the performance increase is always minimal and the price-to-performance ratio starts to climb. Still, for the past 3-4 years (or so, I guess), never has the second-highest-end GPU had such a small performance difference from the highest-end GPU. It's usually significant enough that the highest-end GPU (GTX x80) still has its place.
Tl;dr,
The GTX Titan was released to make the GTX 780 look incredibly good. People (especially on the internet) will spread the news fast enough, claiming the $650 release price for the GTX 780 is good and reasonable, and people who didn't even bother reading reviews and benchmarks will take their word for it and pay the premium for the GTX 780.
Nvidia is taking a different route to compete with AMD, or one could say they're not even trying to compete with AMD in terms of price/performance (at least for the high-end products).
That's a pretty bad analogy. A GPU is still smooth even with some of its cores/VRAM/etc. disabled; it doesn't increase latency/frame times/etc.
I must've missed something. Why wait a week?
Probably to get the GTX 770 launch into the picture, and maybe price cuts from AMD.
That was my opinion after I read Anandtech's review.
Not all is right at Nvidia, and this is just desperate-times-call-for-desperate-measures stuff. We now await AMD's response, and if they play it right and make the node jump, it could end up being very ugly.
But I don't know why people are complaining about the price, because Nvidia has no good competition for it at the moment. When it does, it will have to reduce it.
GK110 isn't a new anything. It's been around as long as the GTX 680 aka GK104 and is still part of the Kepler family. I think the new cards you're thinking of that are due sometime next year (maybe?) are the Maxwell family of cards.
I still maintain that this is what the 680 should have been a year ago, but I've beaten that horse to death too many times so I'll shut up...
No, if I meant Maxwell I would have said Maxwell. The GTX 700 series is GK110, but the long and short of it is that Nvidia talked this up to be an almighty part, yet we're only talking about 20% faster than the aging 7970. So now we wait for AMD's response, which may still be some time yet.
I'd rather save $200+ and get a 7970GE. If Nvidia really wants to be aggressive they need to sell this for ~$550.
Granted, the price difference between this and Titan is ridiculous, making it a no-brainer purchase. Not for me though. Not upgrading from two 670s yet, hehe.