Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked

Meet The GeForce GTX 780 Ti

Nvidia frankly didn’t need to do much to make its 780 Ti a sharp-looking piece of gear. I traced the history of this industrial design in The Story Of How GeForce GTX 690 And Titan Came To Be, and I remain impressed with the work Nvidia’s engineers did to make the company’s latest high-end card aesthetically pleasing and mechanically effective.

This is GeForce GTX 780 Ti...

...and this is Titan

The GeForce GTX 780 Ti is changed minimally from the design we already know. The card’s model name, etched into the fan shroud, is now painted black—a more noticeable contrast against the silver body than before. Also, the heat sink sitting under that big polycarbonate window is black as well, standing out more ominously than Titan’s aluminum fins. Because the 780 Ti is limited to 3 GB of GDDR5, the final difference is a lack of memory packages on the back of its PCB.

Otherwise—yeah, it’s a very similar-looking product that measures 10.5” long, employs the same centrifugal fan, and offers a similar display connectivity suite. You get two dual-link DVI ports, HDMI, and a full-size DisplayPort connector.

Under the hood, of course, there’s a fully-functional GK110 GPU running at higher clocks than Titan. But Nvidia cites the same 250 W TDP as Titan (indeed, that’s the number it uses for GeForce GTX 780, too). The company says that this is correct—careful binning lets it turn on more of the processor and operate at higher clocks without exceeding the 250 W board power figure.

As a result, GeForce GTX 780 Ti employs the same eight- and six-pin power connectors as 780 and Titan.

Although Nvidia sometimes limits the number of cards that can be used together, it supports four-way SLI configurations with GeForce GTX 780 Ti. Of course, you'll need a compatible platform; it isn't enough to simply use a Z87-based motherboard with its 16 lanes of third-gen PCIe divided up, for example. A properly-equipped X79 board will work, as will a mainstream system with the right PLX switch.

Nvidia also makes a big deal about software adding value to GeForce GTX 780 Ti. To begin, there’s a three-game bundle that includes Assassin’s Creed IV: Black Flag, Batman: Arkham Origins, and Splinter Cell: Blacklist. I rarely get very excited about game bundles, and this one is no exception. Assassin’s Creed is a console port designed for PlayStation 3 and Xbox 360. Batman hasn’t been getting the warmest reception. And I’m personally not a devotee of the Splinter Cell franchise. Nevertheless, that’s still $170 of free games for folks interested in the trio of titles.

More compelling to me is the beta introduction of ShadowPlay (finally). Not everyone is going to get as much of a kick out of this—mostly because not everyone wants to record and play back moments from their digital conquests. However, as a former WoW raider, I have a directory of boss kill videos from back in the day that simply slammed my PC as I tried to capture them with Fraps. Offloading the encode would have been simply brilliant, and I know there are plenty of folks looking for the same functionality today. For more on ShadowPlay and its impact on gaming performance, check out Nvidia's Shield Revisited: Console Mode, Streaming, And More.

    Top Comments
  • Keep up the competition. Performance per dollar is the name of the game, and the consumers are thriving in it right now.
    37
  • Excluding the possibility of bias, it's important to note the various performance results from one site to another. Tom's has the 780 Ti winning the majority of benches while others have the 290X on top for the same applications. I believe this is representative of real world end user scenarios. Individual cards and total system variances IMO will result in the 780 Ti and 290X performing pretty much on par at higher resolutions. Therefore it really comes down to price or preference, but I don't know too many smart people who choose to waste dollars, even for 1%. Win for AMD.
    27
  • Can't wait for fanboy wars! It's going to be fun to watch.
    22
  • Other Comments
  • My heart broke a little bit for AMD. Unless AMD's got something up their sleeve, it's up to the board manufacturers now to get the 290X in a better competitive stance than the 780 ti.
    -14
  • At $700, AMD has nothing to worry about other than the minority of enthusiasts who are willing to pay $200 more for the absolute fastest. Also, when games like Battlefield 4 use Mantle, the performance gains will be eroded or wiped out.
    17
  • I want to see a cooler as efficient as the 780 Ti's on the 290X, and the benchmarks run again. Something tells me the 290X will perform similar to or better than the 780 Ti in that situation.
    14
  • Price versus way too few extra fps over the rival will say a lot no matter who gets the crown, but I can't help imagining the look on the faces of the guys who got Titans for only a few months of "fps supremacy" at insane price tags :)
    6
  • 2x R9 290's for $100 more will destroy the 780Ti. I don't really see where this logically fits in a competitively priced environment. Nice card, silly price point.
    9
  • "Hawaii-based boards delivering frame rates separated by double-digit percentages, the real point is that this behavior is designed into the Radeon R9 290X. "

    It could also come down to production variance between the chips. Seen it before in manufacturing, and it's not pretty. Sounds like we're starting to hit the ceiling with these GPUs... Makes me wonder what architectural magic they'll come up with next.

    IB
    -3
  • I'm going to build a rig for a friend and was planning on getting him the R9 290, but after the R9 290 review I'm quite hesitant. How can we know how the retail version of that card performs? Any chance you guys could pick one up and test it out? Furthermore, how can we know Nvidia isn't pulling the same trick: i.e. giving a press card that performs way above the retail version?
    -3
  • Hoping to get a response this time. I am wondering if AA has any place in the Ultra HD gaming world. I suspect that it gets cranked to 16x on ultra settings, and I wonder if this is actually discernible with the pixel density being so high. It is not like many people can spot a jagged-edged curve when the "jag" is microns big.

    If it has a negligible impact on what it looks like, I am wondering how performance is with single cards on Ultra HD screens WITHOUT ANTI-ALIASING. Please could you investigate, or point me to somewhere that has? Cheers all!
    10
  • EVGA already has the "SC" rated cards both with the default cooler and their ACX cooler.

    Apples to apples, it looks like the 780 Ti will remain faster than the 290X even after we begin to see custom-cooled AMD cards... but at a high premium.
    1
  • Good buy compared to Titan, but not the $500 cards, is what I read out of this. Good performance but questionable value.
    3
  • What I can't understand is how people manage to stay loyal to the green team, especially when they've been using monopolistic tendencies when it comes to pricing their cards... seriously, dropping a card's price point at the snap of a finger by hundreds of dollars, and they're still profiting like monsters, I bet.

    And yet, people will continue to eat up their products like mindless sheep. Guess a lot of people have disposable income.
    4
  • Sigh... it's too expensive compared to the 290X and the 290 for the performance. Slap a waterblock on the 290X and this card, overclock both of them to the limit, and we will see which one is better. Still, I'm not gonna pay an extra $300 for this card over the 290X.
    15
  • Sorry, I mean the R9 290.
    3
  • One thing is certain: the 290X is completely irrelevant. Either get the much cheaper 290 with the same performance or the expensive 780 Ti with better performance.
    -3
  • This seems to be the bottom line... get the 780 Ti if you absolutely want the best and have money to burn (or wait a bit to see if they will indeed release the higher-memory ones), and the 290 for money vs. performance.
    8
  • 1350678 said:
    I'm going to build a rig for a friend and was planning on getting him the R9 290, but after the R9 290 review I'm quite hesitant. How can we know how the retail version of that card performs? Any chance you guys could pick one up and test it out? Furthermore, how can we know Nvidia isn't pulling the same trick: i.e. giving a press card that performs way above the retail version?

    Well, it is possible, but highly unlikely, given that Nvidia has a hard-defined minimum clock rate and a much narrower range... Plus, this is a reference board; it's fully possible that a retail card with a custom cooler will perform much higher (like the Gigabyte 780 in this article).

    And, it's not an issue with the Titan or the 780, which are both based on GK110...which has been out for months, and has stable drivers.

    133584 said:
    What I can't understand is how people manage to stay loyal to the green team, especially when they've been using monopolistic tendencies when it comes to pricing their cards... seriously, dropping a card's price point at the snap of a finger by hundreds of dollars, and they're still profiting like monsters, I bet. And yet, people will continue to eat up their products like mindless sheep. Guess a lot of people have disposable income.

    What I can't understand is why this has to be a ****ing war.

    When they had no competition, they charged a lot of money for their top end cards. No one was forced to buy these overpriced cards. If no one bought them, they'd drop prices. When AMD released solid competition, they dropped prices.

    That's how the market works. If you think AMD wouldn't do the same, well, what can I say...

    AMD haven't been in a position to do that for a long time on either the GPU or CPU front, which is why they haven't.

    When they tried to release an $800 FX CPU (this is without a monopoly or lead in the market, btw), no one bought it, and AMD had to drop prices by more than half.
    0