PCI Express 3.0 Representing
There’s a story behind Nvidia’s support for third-gen PCI Express on Intel’s X79 Express platform, but it requires a little bit of history.
Way back when I first previewed Sandy Bridge-E (check out Intel Core i7-3960X (Sandy Bridge-E) And X79 Platform Preview for that little piece of history), everyone I talked to insisted that the processor’s PCIe controller wasn’t going to be validated at 8 GT/s data rates; it’d be a PCIe 2.0 part. Then, suddenly, the story changed, and the controller was described as 8 GT/s-capable (though mention of the PCIe 3.0 standard itself was left out).
When AMD launched its Radeon HD 7000-series cards, we were able to demonstrate them operating at PCI Express 3.0 signaling speeds. Then, Nvidia launched its GeForce GTX 680—with a press driver that was limited to 5 GT/s. The company sent us a second version to show that PCI Express 3.0 was working, and assured us that it’d operate at 8 GT/s on Ivy Bridge-based platforms (which we’ve since confirmed).
Why not just ship it that way? There was a reason; we’re digging deeper, but we aren’t yet ready to discuss our findings.
Let’s put the puzzle pieces together, though.
- X79 and Sandy Bridge-E were originally going to operate at second-gen signaling rates.
- GeForce GTX 680, a card that scales really well in SLI, operates at 5 GT/s data rates attached to Sandy Bridge-E processors and 8 GT/s in Ivy Bridge-based platforms.
- GeForce GTX 690 offers 8 GT/s signaling in both Sandy Bridge-E and Ivy Bridge-based platforms.
The issue doesn’t appear to be related to GK104, Nvidia’s card, or its driver. Rather, it’d seem to relate back to our original report that Sandy Bridge-E was not fully validated for PCI Express 3.0.
GTX 690 at PCIe 3.0 on X79
GTX 680 at PCIe 2.0 on X79
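If you'd like to verify the negotiated link rate on your own hardware, tools like GPU-Z report it under Windows, and Linux exposes the same data through sysfs. Below is a minimal Python sketch, assuming a Linux system with the standard sysfs layout; it filters PCI devices by Nvidia's vendor ID (0x10de) and the display-controller class code (0x03). As a reference point, 8 GT/s signaling corresponds to PCI Express 3.0, 5 GT/s to 2.0, and 2.5 GT/s to 1.x.

```python
# Minimal sketch (Linux-only): report the negotiated PCIe link rate for any
# Nvidia display adapter via sysfs. 8 GT/s = PCIe 3.0, 5 GT/s = PCIe 2.0.
from pathlib import Path

NVIDIA_VENDOR_ID = "0x10de"

def read(path: Path) -> str:
    # Some attributes don't exist on every device; fail soft.
    try:
        return path.read_text().strip()
    except OSError:
        return "n/a"

for dev in Path("/sys/bus/pci/devices").iterdir():
    # Display controllers carry a PCI class code beginning with 0x03.
    if read(dev / "vendor") == NVIDIA_VENDOR_ID and read(dev / "class").startswith("0x03"):
        print(f"{dev.name}: current link {read(dev / 'current_link_speed')} "
              f"x{read(dev / 'current_link_width')}, "
              f"max {read(dev / 'max_link_speed')} x{read(dev / 'max_link_width')}")
```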
Is This It For Affluent Gamers In 2012?
I saw a lot of comments from folks who read GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation and decided they wanted to wait for Nvidia to launch a desktop-oriented card based on a more complex graphics processor—if only because they were unwilling to pay $500 for the company’s next-gen “Hunter” (if you don’t know what I’m talking about, check out the first page of my GeForce GTX 680 review).
On behalf of those folks, I pressed Nvidia for more information about a proper “Tank” in the GeForce GTX 600-series. Although the company’s representatives were deliberately vague about the existence of another GPU, they clearly indicated that GeForce GTX 690 wouldn’t be eclipsed any time soon. Personally, I’d be surprised to see anything based on a higher-end GPU before Q4.
Even then, there’s no guarantee that a tank-class card would outperform two GK104s (GF104 had little trouble destroying GF100 in Amazing SLI Scaling: Do Two GeForce GTX 460s Beat One GTX 480?, after all). The more likely outcome would be a better-balanced GPU able to game and handle compute-oriented tasks.
6990s aren't that hard to find. Back in January, before the 7000 series came out, you could easily pick one or two up. I'm pretty sure that in six months, the 690 will be just as easy to grab.
Also, the design of this reminds me of my old Leadtek 5900:
http://www.ixbt.com/video2/images/gffx-27/leadtek-5900lx-front.jpg
At the GTX 590 launch...
GTX 580 launch price: $500
GTX 590 launch price: $700
Difference: 40%
Now, at today's GTX 690 launch...
GTX 680 launch price: $500
GTX 690 launch price: $1000
Difference: 100%
So, is it just me, or is Nvidia really gouging on price here?
Why the hell else would they be charging 43% more than they did for their last dual-GPU launch while using less silicon?
Come on AMD, we really need some more competition here.
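For anyone double-checking those figures, here's a quick throwaway sketch using the launch prices quoted above:

```python
# Launch-price premiums quoted in the comment above.
prices = {"GTX 580": 500, "GTX 590": 700, "GTX 680": 500, "GTX 690": 1000}

def premium(newer: str, baseline: str) -> float:
    """Percentage by which the newer card's launch price exceeds the baseline's."""
    return (prices[newer] - prices[baseline]) / prices[baseline] * 100

print(f"GTX 590 over GTX 580: +{premium('GTX 590', 'GTX 580'):.0f}%")  # +40%
print(f"GTX 690 over GTX 680: +{premium('GTX 690', 'GTX 680'):.0f}%")  # +100%
print(f"GTX 690 over GTX 590: +{premium('GTX 690', 'GTX 590'):.0f}%")  # +43%
```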
The only use case for that much graphics hardware is excessively high multi-monitor resolutions, like 6k x 2k in 3D. AMD doesn't need to invest in a new PCB design for a dual-GPU GCN card yet, because the demand is so low.