Nvidia GeForce GTX 780 Review: Titan’s Baby Brother Is Born

Power Consumption And GPU Boost

Clock Rates and Thermal Limits

We already covered GPU Boost 2.0 and the subtle changes Nvidia made to GeForce GTX 780 compared to Titan. Both cards start limiting core clock rates once the GPU reaches 60°C, dialing back the performance gains attributable to GPU Boost. This becomes more pronounced as core temperature rises, though the two cards behave differently: the 780 exhibits much greater and more varied clock speed fluctuations, while the GTX Titan is basically limited to three clock levels. And while GeForce GTX Titan practically loses the ability to boost its clock speeds as soon as it reaches its thermal limit, the graph still shows the 780’s core frequency spiking upward consistently.
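The behavior described above can be sketched as a simple throttle curve. To be clear, this is an illustrative model only, not Nvidia's actual GPU Boost 2.0 algorithm; the 863/900 MHz figures are the GTX 780's published base and boost clocks, but the linear ramp and the 80°C cutoff are assumptions for the sake of the sketch:

```python
def boosted_clock_mhz(temp_c, base=863, boost=900, t_start=60, t_limit=80):
    """Illustrative GPU Boost-style throttle curve (not Nvidia's real algorithm).

    Below t_start the card holds its full boost clock; between t_start and
    t_limit the boost headroom is scaled back linearly; at or above t_limit
    the card falls back to its base clock.
    """
    if temp_c < t_start:
        return boost
    if temp_c >= t_limit:
        return base
    # Linearly trade boost headroom for remaining temperature headroom.
    frac = (t_limit - temp_c) / (t_limit - t_start)
    return base + (boost - base) * frac

print(boosted_clock_mhz(50))  # 900 -- full boost while the GPU stays cool
print(boosted_clock_mhz(70))  # 881.5 -- partial throttle past 60 degrees C
print(boosted_clock_mhz(80))  # 863 -- pinned at base clock at the limit
```

In this framing, the 780's "greater and more varied fluctuations" correspond to it spending more time on the sloped part of the curve, while Titan's three observed clock levels suggest a coarser, stepped response.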

Power Consumption

In less demanding applications (including games), the GeForce GTX 780 uses slightly less power than Titan. Although both cards bear the same TDP, the small size of the gap is somewhat surprising; you'd think the pared-back hardware would be noticeably less power-hungry. Instead, the difference is in line with what we're used to seeing from two similar cards with different amounts of RAM. In other words, it seems that the deactivated hardware blocks on the GTX 780 still draw some power.

Also curious: the GeForce GTX 780 appears to use more power under full load than Titan until it reaches its thermal limit, even though the bigger card runs into its thermal limit sooner while still offering more performance. One explanation is that Titan operates closer to the GK110 GPU’s sweet spot, while the GTX 780 relies on higher clock speeds for its performance.

As long as the cards stay within their predefined thermal limits, they can hit power peaks beyond what their nominal TDP would allow. In practice, you will see these peaks only rarely, and very briefly at that. Still, take them into account and pick your power supply accordingly.
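One way to account for those brief spikes is to budget headroom above the card's rated TDP when sizing a power supply. A minimal sketch, with the caveat that the 30% spike margin, the 150 W rest-of-system estimate, and the `recommend_psu_watts` helper are all assumptions for illustration, not figures from this review:

```python
def recommend_psu_watts(gpu_tdp, rest_of_system=150, spike_margin=0.30):
    """Rough PSU sizing with headroom for brief power spikes above TDP.

    The 30% margin and 150 W rest-of-system draw are rule-of-thumb
    assumptions, not measured values.
    """
    peak = gpu_tdp * (1 + spike_margin) + rest_of_system
    # Round up to the next 50 W tier, which is how PSUs are typically sold.
    return -(-peak // 50) * 50  # ceiling division

print(recommend_psu_watts(250))  # 500.0 -- for a 250 W card under these assumptions
```

The exact numbers matter less than the principle: plan for the transient peaks, not just the nominal TDP.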

Effects of the Thermal Limit

Lastly, let’s look at what happens when both cards hit their thermal limits after being under load for a while. The GeForce GTX 780’s power consumption drops from 245 to 232 W, while the GeForce GTX Titan only dips by 2 W, from 238 to 236 W. This is another example of how much more headroom GPU Boost 2.0 provides, extracting extra performance as long as the GPU remains cool enough.
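As a quick sanity check on those figures (simple arithmetic on the numbers above, not additional measurements):

```python
# Power draw before/after hitting the thermal limit, in watts (from the text).
cards = {"GTX 780": (245, 232), "GTX Titan": (238, 236)}

for name, (before, after) in cards.items():
    drop = before - after
    pct = 100 * drop / before
    print(f"{name}: -{drop} W ({pct:.1f}%)")
```

The 780 sheds about 13 W (roughly 5%) at its thermal limit, versus Titan's 2 W (under 1%), which is the headroom gap the paragraph above describes.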

  • CrisisCauser
    A good alternative to the Titan. $650 was the original GTX 280 price before AMD came knocking with the Radeon 4870. I wonder if AMD has another surprise in store.
  • gigantor21
    GG Titan.
  • It's definitely a more reasonably priced alternative to the Titan, but it's still lacking in compute. Which might disappoint some but I don't think it'll bother most people. Definitely not bad bang for buck at that price range considering how performance scales with higher priced products, but it could've been better, $550-$600 seems like a more reasonable price for this.
  • hero1
    This is what I have been waiting for. Nice review and I like the multi gpu tests. Thanks. Time to search the stores. Woohoo!!
  • natoco
    To much wasted silicon (just a failed high spec chip made last year, even the titan) and rebadged with all the failed sections turned off. I wanted to upgrade my gtx480 for a 780 but for the die size, the performance is to low unfortunately. It has certainly not hit the trifecta like the 680 did. Would you buy a V8 with 2 cylinders turned off even if it were cheaper? No, because it would not be as smooth as it was engineered to be, so using that analogy, No deal. customer lost till next year when they release a chip to the public that's all switched on, will never go down the turned off parts in chip route again.
  • EzioAs
    In my opinion, this card and the Titan is actually a clever product release by Nvidia. Much like the GTX 680 and GTX 670, the Titan was released at higher price (like the GTX 680) while the slightly slower GTX 780 (the GTX670 for the GTX600 series case) is at a significantly lower price but performing quite close to it's higher-end brother. We all remember when the GTX 670 launched it makes the GTX680 looks bad because the GTX 670 was 80% of the price while maintaining around 90-95% of the performance.

    Of course, one could argue that as we get closer to higher-end products, the performance increase is always minimal and price to performance ratio starts to increase, however, for the past 3-4 years (or so I guess), never has it been that the 2nd highest-end GPU having such low performance difference with the highest-end GPU. It's usually significant enough that the highest end GPU (GTX x80) still has it's place.

    Tl;dr,

    The GTX Titan was released to make the GTX 780 look incredibly good, and people (especially on the internet), will spread the news fast enough claiming the $650 release price for the GTX 780 is good and reasonable, and people who didn't even bother reading reviews and benchmarks, will take their word and pay the premium for GTX 780.

    Nvidia is taking a different route to compete with AMD or one could say that they're not even trying to compete with AMD in terms of price/performance (at least for the high-end products).
  • mouse24
natoco said: To much wasted silicon (just a failed high spec chip made last year, even the titan) and rebadged with all the failed sections turned off. I wanted to upgrade my gtx480 for a 780 but for the die size, the performance is to low unfortunately. It has certainly not hit the trifecta like the 680 did. Would you buy a V8 with 2 cylinders turned off even if it were cheaper? No, because it would not be as smooth as it was engineered to be, so using that analogy, No deal. customer lost till next year when they release a chip to the public that's all switched on, will never go down the turned off parts in chip route again.


    That's a pretty bad analogy. A gpu is still smooth even with some of the cores/vram/etc turned off, it doesn't increase latency/frametimes/etc.
  • godfather666
    "But, I’m going to wait a week before deciding what I’d spend my money on in the high-end graphics market. "

    I must've missed something. Why wait a week?
  • JamesSneed
    Natoco, your comment was so clueless. It is likely every single CPU or GPU you have ever purchased has fused off parts. Even the $1000 extreme Intel cpu has a little bit fused off since its a 6 core CPU but using a 8 core Xeon as its starting point. Your comparison to a car is idiotic.
  • 016ive
    You will have to be an idiot to buy a Titan now that the 780 is here...Me, I could afford neither :)
  • Sakkura
    godfather666 said: "But, I’m going to wait a week before deciding what I’d spend my money on in the high-end graphics market." I must've missed something. Why wait a week?

    Probably to get the GTX 770 launch into the picture, and maybe price cuts from AMD.
  • EzioAs
    696345 said:
    This review has done a great job proving how well the HD7970 GHz edition performs is as a single GPU solution. Beat the 680 in almost every benchmark on older silicon. I'm excited to see what AMD has in store with the HD9000 series.


    That was my opinion after I read Anandtech's review. :)
  • rmpumper
    Techpowerup has the Gigabyte 780 OC review and it kicks Titan in the butt - the higher the res, the better 780 is than Titan.
  • sarinaide
    Its about a year ago Kepler was introduced in a blaze of glory, less than a year and its been cast aside for a new generation well before its intended release date, around 8 months sooner than its expected release that toms mentioned was march, conversely Tahiti and Cape Verde was released in Nov 2011 and while Cape Verde is EOL and replaced by a faster and lightly powered Bonaire, Tahiti is still going strong. I am awaiting Toms benches on the new catalyst 13.5 drivers once out as I think we will see more gains from what is now archiac of an arch.

    Not all is right at nvidia and this is just desperate times for desperate measures stuff, we now await AMD's response and if they play it right and make the node jump it could end up being very ugly.
  • sephirothmk
    Can shadowplay record more than 20 minutes?
  • kammak743
    What would be really awesome is if the GTX 790 was either a GK110 with nothing disabled or 2 GK110's with something disabled (although it would be amazing 2 full power GK110's)
    but i don't know why people are complaining about the price because nvidia had no good competition for it at the moment and when they do they will have to reduce it
  • sarinaide
    122690 said:
    sarinaide said: Its about a year ago Kepler was introduced in a blaze of glory, less than a year and its been cast aside for a new generation well before its intended release date, around 8 months sooner than its expected release that toms mentioned was march, conversely Tahiti and Cape Verde was released in Nov 2011 and while Cape Verde is EOL and replaced by a faster and lightly powered Bonaire, Tahiti is still going strong. I am awaiting Toms benches on the new catalyst 13.5 drivers once out as I think we will see more gains from what is now archiac of an arch. Not all is right at nvidia and this is just desperate times for desperate measures stuff, we now await AMD's response and if they play it right and make the node jump it could end up being very ugly.
    GK110 isn't a new anything. It's been around as long as the GTX 680 aka GK104 and is still part of the Kepler family. I think the new cards you're thinking of that are due sometime next year (maybe?) are the Maxwell family of cards. I still maintain that this is what the 680 should have been a year ago, but I've beaten that horse to death too many times so I'll shut up...


    No, if I meant Maxwell I would have said Maxwell. GTX 700 is GK110 but in the long and short Nvidia talked this up to be an almighty part yet we are only talking about 20% faster than the aging 7970. So now we wait for AMD's response which may still be some time yet.
  • cknobman
    At $650 I am just not seeing it. In fact I dont even see this card putting any pressure on AMD to do something.

    I'd rather save $200+ and get a 7970GE. If Nvidia really wants to be aggressive they need to sell this for ~$550.
  • TheMadFapper
    Exactly what happened between the 670 and 680, and exactly why I bought two 670s instead of spending another $120 on a 2-5% increase in performance.

    Granted, the price difference between this and Titan is ridiculous, making it a no-brainer purchase. Not for me though. Not upgrading from two 670s yet, hehe.
  • Sakkura
    59464 said:
    At $650 I am just not seeing it. In fact I dont even see this card putting any pressure on AMD to do something. I'd rather save $200+ and get a 7970GE. If Nvidia really wants to be aggressive they need to sell this for ~$550.

    I'm guessing the pricing on the GTX 770 will be more aggressive.
  • Onus
    I would like to use nVidia for games, but my HD7970 may yet pay for itself mining BTC. A "free" HD7970? That's an impossible bang/buck factor to ignore.
  • ryang1428
    Do you guys see 680 prices dropping in the next few weeks? I have been waiting to buy one for some time now but they havent budged today
  • sarinaide
    Just punching some numbers, the 780 on the median of games TH benched is from 7%-20% faster than the HD7970, Skyrim being notoriously rough on AMD parts shows up 18%. TH and other sites previewed the GTX780 last year and touted 35-50% gains. Later SA in a review did mention it would be lucky to see 20% gains on the exact same node and lo and behold we have the GTX780 somewhere in the 10-20% faster bracket, probably around 15% faster than AMD's flagship single GPU which again I stress is now closer to 2 years old while the 680 is obsolete around a year in. We have all seen rumours of AMD's next GPU and if rumours are true and AMD jumps node to 20nm along with stacked RAM and doubling resources it won't look very good from a team green perspective.
  • toddybody
    natoco said: I wanted to upgrade my gtx480 for a 780 but for the die size, the performance is to low unfortunately. It has certainly not hit the trifecta like the 680 did.



    You're whining about efficiency and design... and you own a 480?


    A 780 would be an INCREDIBLE upgrade over the power hungry/OLD 480.