
GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation

Enthusiasts want to know about Nvidia's next-generation architecture so badly that they broke into our content management system and took the data to be used for today's launch. Now we can really answer how Kepler fares against AMD's GCN architecture.

Nvidia is fond of drawing parallels. With its prior-generation graphics cards, the company likened each model to a different role on a virtual battlefield. GeForce GTX 480 was the tank—big performance and, to a fault, a big price, big power, and big heat, as well. GeForce GTX 460 came to be referred to as the hunter, incorporating a better balance of speed, efficiency, and cost more apropos to gamers. Finally, GeForce GTS 450 was dubbed the sniper for its focus on enabling playable frame rates at 1680x1050, according to Nvidia.

As silly as that trio of categories seemed, they make it easier for us to put a finger on the pulse of GeForce GTX 680. Though its name (and price) suggests a successor to Nvidia’s current single-GPU flagship, this is decidedly the hunter—a gamer-oriented card that almost completely de-emphasizes the once-emphatic message of improving general-purpose compute performance. But hey, it does that whole gamer thing really well, just like the GeForce GTX 460.

GK104: The hunter that succeeded a tank

Fitting In Isn’t Always Easy

Regardless of the role it was designed to fill, competition is the biggest influencer of positioning. AMD may have higher-end cards on its roadmap that we haven’t seen or heard about yet. However, in the context of AMD’s six Radeon HD 7000-series boards that are already available, Nvidia knows exactly what it’s up against.

Had Radeon HD 7970 been 30 or 40 percent faster than it is, there’s a good chance we wouldn’t be looking at a GeForce GTX 680 today. Maybe it would have been called GTX 660 or 670. But because of where AMD’s flagship sits, Nvidia sees justification in crowning its new hunter as a successor to its old tank—all the while making it pretty clear that another piece of heavy armor is in the works.  

What we have on our hands, then, is a $500 card based on Nvidia’s GK104 graphics processor, designed specifically for gamers (if you’re interested in compute potential, you’ll have to keep waiting). GeForce GTX 680 addresses some of last generation’s most glaring competitive disadvantages, and it adds a handful of interesting features, too. 

Meet GeForce GTX 680

At 10” long, the GeForce GTX 680’s PCB is half an inch longer than AMD’s Radeon HD 7800 boards and half an inch shorter than the Radeon HD 7900s.

Looking at the card head-on, we see that it employs a centrifugal fan, which pushes hot air out of the card’s rear bracket. There’s only about half of one slot available for passing exhaust back there. But as we’ll see in our thermal and acoustic tests, the GTX 680 does not have a problem contending with heat.

The rest of the dual-slot bracket plays host to four display outputs: two dual-link DVI connectors, one full-sized HDMI port, and a DisplayPort output. All of them are usable concurrently, addressing one of our sharpest critiques of all prior-gen Fermi-based boards. At long last, we can consider multi-monitor gaming tests to replace 2560x1600 in our single-card reviews (and indeed, multi-monitor benchmarks will follow in a story on which we're already working)! Like AMD, Nvidia claims that this card supports HDMI 1.4a, displays with 4K horizontal resolutions, and multi-stream audio.

Up top, GeForce GTX 680 features twin SLI connectors, enabling two-, three-, and four-way SLI configurations. For comparison, AMD’s Radeon HD 7970 and 7950 also support up to four-way arrays.

We also get our first piece of physical evidence that Nvidia’s GK104 processor was designed to be a viable option in more mainstream environments: GeForce GTX 680 employs two six-pin auxiliary power connectors. Those two inputs, plus a PCI Express slot, facilitate up to 225 W of power delivery. Nvidia rates this card for up to 195 W. However, it also says typical power use is closer to 170 W. Keep those numbers in mind; the headroom between 170 W and the 225 W specification ceiling comes into play shortly.
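To put those figures in perspective, here is a minimal sketch of the power-budget arithmetic in Python, assuming the standard PCI Express delivery limits of 75 W from the x16 slot and 75 W per six-pin auxiliary connector:

PCIE_SLOT_W = 75      # PCI Express x16 slot delivers up to 75 W
SIX_PIN_W = 75        # each 6-pin auxiliary connector delivers up to 75 W
SIX_PIN_COUNT = 2     # GeForce GTX 680 carries two 6-pin connectors

ceiling_w = PCIE_SLOT_W + SIX_PIN_COUNT * SIX_PIN_W   # 225 W delivery ceiling
rated_w = 195         # Nvidia's rated board power
typical_w = 170       # Nvidia's stated typical gaming draw

print(f"Delivery ceiling: {ceiling_w} W")                         # 225 W
print(f"Headroom above rated power: {ceiling_w - rated_w} W")     # 30 W
print(f"Headroom above typical draw: {ceiling_w - typical_w} W")  # 55 W

That 30 to 55 W gap between what the card typically draws and what its connectors can deliver is the headroom referenced above.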

Keeping GeForce GTX 680 Cool

Nvidia claims it put significant effort into three aspects of its dual-slot cooler’s design, contributing to an impressive acoustic footprint, even under load. We can peel back the GTX 680’s shroud for a closer look…

First, the company cites a trio of horseshoe-shaped heat pipes embedded in the GPU heat sink, which quickly draw heat away from GK104. Thermal energy is transferred from those pipes into a dual-slot aluminum sink. 

Optimizations to the sink itself also rank on Nvidia’s list of improvements. For instance, the fin stack is angled where air exits, creating more internal space between the cooler and the exhaust grate. Apparently, in prior designs, heat was getting trapped between the bracket and fins, affecting cooling performance. This new approach is said to yield lower temperatures than the older implementation.

Finally, Nvidia says it added acoustic dampening material to the fan motor—a step it also took with the GeForce GTX 580, which contributed to getting that card’s noise level down compared to its maligned predecessor.

Comments
  • 38 Hide
    Anonymous , March 22, 2012 12:46 PM
    Hail to the new king.
  • 44 Hide
    borden5 , March 22, 2012 12:55 PM
    oh man this's good news for consumer, hope to see a price war soon
  • 26 Hide
    johnners2981 , March 22, 2012 12:58 PM
    Damn prices, in europe we have to pay the equivalent of $650-$700 to get one
  • 33 Hide
    outlw6669 , March 22, 2012 12:59 PM
    Nice results, this is how the transition to 28nm should be.
    Now we just need prices to start dropping, although significant drops will probably not come until the GK110 is released :/ 
  • 23 Hide
    Anonymous , March 22, 2012 1:00 PM
    Finally we will see prices going down (either way :-) )
  • -4 Hide
    Scotty99 , March 22, 2012 1:03 PM
    Its a midrange card, anyone who disagrees is plain wrong. Thats not to say its a bad card, what happened here is nvidia is so far ahead of AMD in tech that the mid range card purposed to fill the 560ti in the lineup actually competed with AMD's flagship. If you dont believe me that is fine, you will see in a couple months when the actual flagship comes out, the ones with the 384 bit interface.
  • 26 Hide
    Chainzsaw , March 22, 2012 1:04 PM
    Wow not too bad. Looks like the 680 is actually cheaper than the 7970 right now, about 50$, and generally beats the 7970, but obviously not at everything.

    Good going Nvidia...
  • 32 Hide
    SkyWalker1726 , March 22, 2012 1:05 PM
    AMD will certainly Drop the price of the 7xxx series
  • 20 Hide
    rantoc , March 22, 2012 1:13 PM
    2x of thoose ordered and will be delivered tomorrow, will be a nice geeky weekend for sure =)
  • 23 Hide
    Scotty99 , March 22, 2012 1:21 PM
    Quoting scrumworks: "Nothing surprising here. Little overclocking can put Tahiti right at the same level. Kepler is actually losing to Tahiti in really demanding games like Metro 2033 that uses the latest tech. Pointless to test ancient and low tech games like World of Warcrap that is ancient, uses dx9 and is not considered cutting edge in any meter."

    Sigh...

    WoW has had DX11 for quite a long time now. Also, go play in a 25 man raid with every detail setting on ultra with 8xAA and 16x AAF and tell me WoW is not taxing on a PC.
  • 16 Hide
    yougotjaked , March 22, 2012 1:21 PM
    Wait what does it mean by "if you’re interested in compute potential, you’ll have to keep waiting"?
  • 0 Hide
    dragonsqrrl , March 22, 2012 1:22 PM
    Just throwing this out there now, but some AMD fanboy will find a way to discredit or marginalize these results.

    ...oh, wait.
  • 24 Hide
    klausey , March 22, 2012 1:24 PM
    Great to see nVidia jumping back into the game and forcing AMD to lower its prices accordingly. I was shocked to see the card actually available at the MSRP of $500 on launch day. I guess we'll see how long that lasts.

    For everyone suggesting that nVidia will release another true "flagship" beyond the 680, I think you are spot on, IF AMD gives them a reason to. There's no reason to push it at the moment as they already hold the crown. If, on the other hand, AMD goes out and makes a 7980, or 79070 SE card with higher clocks (more like what the 7970 can achieve when properly overclocked), I definitely see nVidia stepping their game up a bit.

    Either way, it's awesome to see both AMD and now nVidia taking power consumption into consideration. I'm tired of my computer room feeling like a toaster after an all nighter.
  • 18 Hide
    rantoc , March 22, 2012 1:24 PM
    Quoting yougotjaked: "Wait what does it mean by 'if you’re interested in compute potential, you’ll have to keep waiting'?"

    He means waiting for GK110, which will be more of a compute-oriented card, while this GK104 is geared toward gaming.
  • 13 Hide
    EXT64 , March 22, 2012 1:26 PM
    Really disappointing DP compute, but a tradeoff had to be made and this card is meant for gaming, so I can understand their position. Hopefully GK110 is a real card and will eventually come out.
  • -7 Hide
    Anonymous , March 22, 2012 1:27 PM
    but will it run tetris with multiple displays?
  • 7 Hide
    amk-aka-Phantom , March 22, 2012 1:27 PM
    Oh yeah, team green strikes back! :D  Now let's see what 660 Ti will be like, might suggest that to a friend.