I continue to be impressed by Nvidia’s industrial design. Sexy isn’t a word I typically ascribe to a piece of PC hardware, but it’s hard not to admire the GeForce GTX 690, Titan, and now the 780. Aesthetically, GeForce GTX 780 is almost identical to Titan, aside from the GTX 780 etched in the front of the card’s shroud. Consequently, everything I said about Titan in my February launch coverage applies here, too.

We’re looking at another 10.5”-long board, which, again, is half of an inch shorter than AMD’s Radeon HD 7970. And whereas the Tahiti-powered competition employs a cheaper-feeling plastic cover, GTX 780 sports a familiar aluminum shell surrounding a polycarbonate window that peers into the card’s heat sink. Unfortunately, the magnesium alloy fan housing we were treated to on GeForce GTX 690 is gone, just as it was with Titan.

Similar also is the centrifugal fan that effectively exhausts heated air from the 780’s rear I/O panel. This is a particularly big deal in multi-card configurations, since you don’t want waste heat getting recirculated back into your case, sabotaging your CPU overclock.

One thing Nvidia says it did improve on GeForce GTX 780 compared to Titan, however, is fan control. The company boasts that it developed a controller with an adaptive temperature filter and a thermally targeted software algorithm that maintains fan speeds more intelligently and consistently. On paper, the change is quite small—100-RPM fluctuations are tightened up to a roughly 20-RPM range. But that’s purportedly enough to bring noise down a few decibels compared to the GeForce GTX 680.
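Nvidia hasn’t published how its controller works, but the idea of an adaptive temperature filter is straightforward: smooth the sensor input before mapping it to a fan speed, so brief temperature spikes don’t translate into audible RPM swings. The toy sketch below (not Nvidia’s actual algorithm; the target, base RPM, and gain values are made-up illustrations) shows how an exponential moving average tightens roughly 100-RPM swings into a much narrower band:

```python
# Toy sketch of temperature-filtered fan control. All constants here
# are illustrative, not Nvidia's; the point is only that filtering the
# temperature input narrows the fan's RPM range.

def smooth(temps, alpha=0.1):
    """Exponential moving average over a temperature trace (deg C)."""
    out, ema = [], temps[0]
    for t in temps:
        ema = alpha * t + (1 - alpha) * ema
        out.append(ema)
    return out

def fan_rpm(temp_c, target_c=80, base_rpm=2200, gain=100):
    """Map temperature above a thermal target to a fan speed."""
    return base_rpm + gain * max(0, temp_c - target_c)

# A noisy trace hovering around the target: unfiltered control swings
# the full 100 RPM per degree; the filtered version stays far tighter.
trace = [80, 81, 80, 79, 81, 80, 80, 81]
raw = [fan_rpm(t) for t in trace]
filtered = [fan_rpm(t) for t in smooth(trace)]
```

With these made-up numbers, the raw controller spans 100 RPM while the filtered one stays within about 20 RPM—the same order of improvement the 780’s spec sheet claims.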
Oddly, no acoustic comparisons are drawn to Titan, which we assume means that noise is comparable between the two cards. This makes sense, given similar GPUs and industrial designs. Moreover, the GeForce GTX 780 and Titan share a 250 W TDP. So it’s hardly a surprise that they both have eight- and six-pin auxiliary power connectors, too.

Display connectivity also gets replicated from the GeForce GTX Titan to the 780. Two dual-link DVI outputs, HDMI, and DisplayPort enable as many as four simultaneous screens. You can use three in Surround and a fourth as a Windows desktop screen, just sort of hanging out. By far, triple-screen arrays are the most attractive with a card like this, though.
Nvidia says that the GeForce GTX 780 will completely replace the GeForce GTX 680 in its lineup and sell for $650. That’s an almost-$200 premium for the 780’s additional gaming performance and a bit of acoustic dampening. Meanwhile, Radeon HD 7970 GHz Edition cards are selling for $450 and significantly outpacing the old GK104-based 680. Without a solution back down in the GeForce GTX 680’s price range, AMD is going to eat that space up (particularly with non-GHz Edition cards going for $400). It’s a poorly-kept secret that more GeForce GTX 700-series cards are on their way, and those should help. The question for today is: does the GeForce GTX 780’s speed warrant that greater-than-40% increase in price?
- GK110 Gets A Little Bit Leaner
- GeForce GTX 780: The Card
- GeForce Experience And ShadowPlay
- GPU Boost 2.0 And Troubleshooting Overclocking
- Test Setup And Benchmarks
- Single-Card Results: Battlefield 3
- Single-Card Results: BioShock Infinite
- Single-Card Results: Borderlands 2
- Single-Card Results: Crysis 3
- Single-Card Results: Far Cry 3
- Single-Card Results: Hitman: Absolution
- Single-Card Results: The Elder Scrolls V: Skyrim
- Single-Card Results: Tomb Raider
- Multi-GPU Results: Battlefield 3
- Multi-GPU Results: BioShock Infinite
- Multi-GPU Results: Borderlands 2
- Multi-GPU Results: Crysis 3
- Multi-GPU Results: Far Cry 3
- Multi-GPU Results: Hitman: Absolution
- Multi-GPU Results: The Elder Scrolls V: Skyrim
- Multi-GPU Results: Tomb Raider
- Heat, Noise, And Cooling
- Power Consumption And GPU Boost
- OpenGL: 2D And 3D Performance
- DirectX And CAD: 2D And 3D Performance
- CUDA Performance
- OpenCL: Single-Precision
- OpenCL: Double-Precision
- GeForce GTX 780: Another GK110-Based Card For Wealthy Gamers
Of course, one could argue that as you move up toward the higher-end products, each step always delivers a smaller performance gain at a worse price-to-performance ratio. But for the past three or four years (roughly), the second-highest-end GPU has never been this close to the flagship. The gap is usually significant enough that the highest-end GPU (the GTX x80) still has its place.
Tl;dr,
The GTX Titan was released to make the GTX 780 look incredibly good. People (especially on the internet) will spread the word fast enough, claiming the $650 launch price for the GTX 780 is good and reasonable, and people who didn’t even bother reading reviews and benchmarks will take their word for it and pay the premium for the GTX 780.
Nvidia is taking a different route against AMD; one could even say they’re not trying to compete with AMD on price/performance at all, at least for the high-end products.
That’s a pretty bad analogy. A GPU still runs smoothly with some of its cores/VRAM/etc. disabled; it doesn’t increase latency or frame times.
I must've missed something. Why wait a week?
Probably to get the GTX 770 launch into the picture, and maybe price cuts from AMD.
That was my opinion after I read Anandtech's review.
Not all is right at Nvidia, and this is desperate-times, desperate-measures stuff. We now await AMD’s response, and if they play it right and make the node jump, it could end up being very ugly.
I don’t know why people are complaining about the price, though. Nvidia has no real competition for it at the moment, and once it does, it will have to cut the price.
GK110 isn't a new anything. It's been around as long as the GTX 680 aka GK104 and is still part of the Kepler family. I think the new cards you're thinking of that are due sometime next year (maybe?) are the Maxwell family of cards.
I still maintain that this is what the 680 should have been a year ago, but I've beaten that horse to death too many times so I'll shut up...
No, if I meant Maxwell I would have said Maxwell. The GTX 700 series is GK110, but the long and short of it is that Nvidia talked this up as an almighty part, yet we’re only looking at about 20% faster than the aging 7970. So now we wait for AMD’s response, which may still be some time away.
I'd rather save $200+ and get a 7970GE. If Nvidia really wants to be aggressive they need to sell this for ~$550.
Granted, the price difference between this and Titan is ridiculous, making it a no-brainer purchase over Titan. Not for me, though; I’m not upgrading from two 670s yet, hehe.