- Radeon HD 7990 And GeForce GTX 690 Duke It Out
- HIS 7970 X2: The Challenger
- EVGA GeForce GTX 690: Elegance, Illustrated
- PowerColor Devil13 HD7990: Big And Flashy
- Benchmark System
- Benchmark Results: Synthetics
- Benchmark Results: Real-World Games
- Micro-Stuttering: The Current Situation
- Micro-Stuttering: Alternate Frame Rendering (AMD)
- Micro-Stuttering: Adaptive VSync (Nvidia)
- Micro-Stuttering: Dynamic V-Sync (AMD)
- Power Consumption
- Noise Comparison Videos: Idle
- Noise Comparison Videos: 500 FPS
- Noise Comparison Videos: Game Loop
- Noise Comparison Videos: Full Load
- Just Because You're Fastest Doesn't Make You The Best
EVGA GeForce GTX 690: Elegance, Illustrated
EVGA GeForce GTX 690: A Tough-To-Beat Incumbent
Nvidia put a lot of effort into engineering a compelling reference GeForce GTX 690, and EVGA taps that implementation for its version of the card. It occupies only two slots and, at 1.04 kg, isn't particularly heavy.
There are actually three different versions of EVGA's GeForce GTX 690: a baseline model with a 915 MHz core and 1502 MHz memory, a Signature card running at the same clock rates (but with a unique bundle), and a water-cooled Hydro Copper Signature board operating at 993/1502 MHz.
The minimalist box design hides massive performance inside.
EVGA doesn’t just include the usual stuff in the box (a pair of power adapters and a couple of display adapters), but also a flashy poster and stickers.
The GeForce GTX 690 sports two complete GK104 GPUs (the same chips that drive Nvidia's single-chip flagship, the GeForce GTX 680). Consequently, EVGA's card wields a total of 3072 CUDA cores (1536 per GPU), 256 texture units (128 per GPU), and 64 ROPs (32 per GPU).
Nvidia isn’t using its old PCIe 2.0-constrained NF200 bridge chip any more to connect its graphics processors. The new switch is PLX's PEX 8747, which supports 48 lanes of third-gen PCI Express connectivity. Again, 16 go to an upstream port, while 16 each create downstream ports attached to the GPUs. Rated latencies as low as 126 ns should help with expedient transfers between the host and GPUs.
Each GPU is mated to 2 GB of GDDR5 memory over a 256-bit interface. Again, this is the same configuration we know from the Nvidia GeForce GTX 680, giving us a similar peak bandwidth number of 192 GB/s.
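That 192 GB/s figure follows directly from the quoted memory specs. As a quick back-of-the-envelope check (the quad-pumped data rate is standard GDDR5 behavior, not something EVGA discloses separately):

```python
# Peak memory bandwidth per GPU, from the specs quoted above.
# GDDR5 transfers data four times per command clock, so a
# 1502 MHz memory clock yields 6008 MT/s effective.
mem_clock_mhz = 1502
transfers_per_clock = 4        # GDDR5 quad data rate
bus_width_bits = 256

effective_rate_mts = mem_clock_mhz * transfers_per_clock   # 6008 MT/s
bandwidth_gbs = effective_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")  # -> 192.3 GB/s
```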
The cores themselves run slightly slower than the single-chip card's, though. Each GK104 on the GeForce GTX 690 operates at 915 MHz, rather than the 680's 1006 MHz. So long as Nvidia's 300 W TDP rating isn't exceeded, GPU Boost should be able to push clock rates up to 1019 MHz, which is only a little lower than the GeForce GTX 680’s maximum frequency.
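Nvidia doesn't publish GPU Boost's internal algorithm, but the idea is to opportunistically step the clock up while the board's estimated power draw stays under its target. A toy model of that behavior might look like the sketch below; only the 915 and 1019 MHz clocks come from the card's specs, while the bin size, power-scaling factor, and 300 W target usage are illustrative assumptions:

```python
def boost_clock(estimated_power_w, base_mhz=915, max_boost_mhz=1019,
                step_mhz=13, target_w=300):
    """Toy model of GPU Boost: raise the core clock in small bins
    while the estimated board power stays under the target.
    Step size and the 2% power growth per bin are invented for
    illustration; Nvidia's real algorithm is not public."""
    clock = base_mhz
    while clock + step_mhz <= max_boost_mhz and estimated_power_w < target_w:
        clock += step_mhz
        estimated_power_w *= 1.02  # assume each bin costs ~2% more power
    return clock

print(boost_clock(200))  # headroom available -> boosts to 1019
print(boost_clock(300))  # already at the power target -> stays at 915
```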
Two eight-pin connectors, together with the PCI Express slot, combine to deliver up to 375 W. This board's predecessor, the GeForce GTX 590, also hit that limit. But dual-GPU cards usually use a little less power than two equivalent single-GPU cards running in SLI. As such, 300 W could be a realistic figure for EVGA's GeForce GTX 690.
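The 375 W ceiling comes straight out of the PCI Express spec's per-source limits, which the arithmetic below makes explicit:

```python
# PCIe power-delivery budget: the spec allows up to 75 W from the
# x16 slot and 150 W per eight-pin auxiliary connector.
slot_w = 75
eight_pin_w = 150
num_eight_pin = 2

max_draw_w = slot_w + num_eight_pin * eight_pin_w
print(max_draw_w)  # -> 375
```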
The GeForce GTX 690 has three DVI connectors and one DisplayPort output, allowing the card to drive up to four displays simultaneously.
EVGA is particularly proud of its warranty coverage, which lasts for three years and is fully transferable. So, if you're the sort to buy the best of the best every year, whoever picks up your left-over GeForce GTX 690 on eBay in 2013 should still be covered. The company also offers warranty extension out to five or 10 years, though we see absolutely zero value in protecting a decade-old graphics card.
Also high on EVGA's list of selling points is its Precision X software, which Nvidia used to illustrate the functionality of GPU Boost back when it launched the GeForce GTX 680. The software facilitates core and memory clock rate control, fan speed tuning, and real-time monitoring of the 690's vital attributes.
Lastly, we were excited to see EVGA launch controller software for the LED under this card's GeForce GTX logo (up on the top edge of the card). Nvidia told us something like this was in development back when we first reviewed the 690, but it wasn't ready yet. Used together with Precision X, the little utility can increase/decrease the LED's brightness based on GPU utilization, clock rate, or frame rate. Pretty cool, and only compatible with EVGA's GeForce GTX 690.
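EVGA hasn't documented how its utility translates those metrics into brightness, but a hypothetical linear utilization-to-brightness mapping illustrates the idea (the function name, the 0-255 output range, and the linear curve are all our own assumptions):

```python
def led_brightness(gpu_utilization_pct):
    """Hypothetical mapping from GPU utilization (0-100%) to an LED
    brightness level (0-255). A simple linear curve for illustration;
    EVGA's utility may use different curves and also supports
    clock-rate- and frame-rate-driven modes."""
    clamped = max(0.0, min(100.0, gpu_utilization_pct))
    return round(clamped * 255 / 100)

print(led_brightness(0))    # idle -> LED off (0)
print(led_brightness(100))  # full load -> full brightness (255)
```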