The Story Of How GeForce GTX 690 And Titan Came To Be

Want To Know The Back-Story Of GeForce GTX 690 And Titan?

When Nvidia’s GeForce GTX 680 launched, it was both faster and less expensive than AMD’s Radeon HD 7970. But the GK104-based card was still pretty familiar-looking. Despite its conservative power consumption and quiet cooler, the card employed a light, plastic shroud.

That’s not really a big deal in the high-end space. For as far back as I can remember, $500 gaming boards employed just enough metal to support the same plastic shell, but lacked any substantial heft. Cooling fans typically ranged from tolerable under load to “…the heck were they thinking?” And power—well, I was just stoked to see a flagship from Nvidia dip below 200 W.

But then came GeForce GTX 690. And then Titan. And then 780. And then 770. Nvidia’s reference design for each was—and I’m going to sound like a total hardware dweeb—beautiful. Not femininely beautiful à la Monica Bellucci’s lips, but mechanically, like a DBS. The 690 introduced us to a chromium-plated aluminum frame, magnesium alloy fan housing, a vapor chamber heat sink over each GPU covered with a polycarbonate window, and LED lighting on top of the card, controllable through a software applet that companies like EVGA make available to download.

The models that followed actually used slightly different materials for the shroud, and weren’t quite as chiseled. But the aesthetic was similarly high-end. Each time a shipment of cards showed up, I was excited…and that doesn’t happen very often.

Nvidia's GeForce GTX 690 concept exploration

It’s Not Easy To Impress An Old-Timer

Alright, at 33 I’m not really that old. But I started reviewing hardware back when I was 18—more than 15 years ago. In the last decade and a half, only a few components really got my blood pumping. Previewing a very early Pentium III and 820-based motherboard with RDRAM months before Intel introduced the platform was cool. So too was the first time I got my hands on ATI’s Rage Fury MAXX. When AMD launched its Athlon 64 FX-51 CPU, it shipped a massive high-end system that I used in the lab for months afterward. And who could forget the Radeon HD 5870 Eyefinity 6 Edition, if only for the novelty?

I was easily as excited about GeForce GTX 690. I have a dead one in my office, and I swear it needs to live out the rest of its days in a frame on the wall.

The GeForce GTX 690 concept gets refined

Of course, when I’m evaluating a piece of hardware, good looks rank relatively low on the list of attributes that dictate a recommendation. Performance, power, noise, and pricing are also natural considerations. So, in the conclusion for GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card, we pointed out that two GTX 680s are faster and better equipped to handle the heat they generate. Then I brought up then-dismal availability. But all the while, I wanted someone at Nvidia to tell me what it took, at a company level, to make the card’s industrial design a reality when it clearly seemed “overbuilt.” Who was responsible? Would GeForce GTX 690 forever alter the minimum acceptable bling on a high-end gaming card?

At the time, Nvidia still had GeForce GTX Titan, 780, and 770 in its back pocket. When I approached Nvidia’s PR team for a more intimate look at what went into 690, it said “we’ll see.”

The GeForce GTX 690 preliminary design freeze

That was more than a year ago though, and now Nvidia’s plans are all much more public. Several months back, during a trip up to Santa Clara for a deep-dive on Tegra 4, I sat down with Andrew Bell, vice president of hardware engineering, and Jonah Alben, senior vice president of GPU engineering, to talk about the conception of this design that I admired so much and what it took to make it a reality.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.