
Radeon HD 7990 And GeForce GTX 690: Bring Out The Big Guns

EVGA GeForce GTX 690: A Tough-To-Beat Incumbent

Nvidia put a lot of effort into engineering a compelling reference GeForce GTX 690, and EVGA taps that implementation for its version of the card. It's only a two-slot board and, at 1.04 kg, not particularly heavy.

There are actually three different versions of EVGA's GeForce GTX 690: a baseline model with a 915 MHz core and 1502 MHz memory, a Signature card running at the same clock rates (but with a unique bundle), and a water-cooled Hydro Copper Signature board operating at 993/1502 MHz.

The minimalist box design hides massive performance inside.

EVGA doesn't stop at the usual box contents (a pair of power adapters and a couple of display adapters); it also throws in a flashy poster and stickers.

The GeForce GTX 690 sports two complete GK104 GPUs (the same ones that drive Nvidia's single-chip flagship GeForce GTX 680). Consequently, EVGA's card offers a total of 3072 CUDA cores (1536 per GPU), 256 texture units (128 per GPU), and 64 ROPs (32 per GPU).

Nvidia isn't using its old PCIe 2.0-constrained NF200 bridge chip any more to connect its graphics processors. The new switch is PLX's PEX 8747, which supports 48 lanes of third-gen PCI Express connectivity. Again, 16 lanes go to an upstream port, while two more sets of 16 create the downstream ports attached to the GPUs. Rated latencies as low as 126 ns should help with expedient transfers between the host and GPUs.
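
If you're curious about the raw numbers, each of those Gen3 x16 links has a ceiling that's easy to sketch out (back-of-the-envelope only; protocol overhead beyond 128b/130b encoding is ignored):

```python
# Back-of-the-envelope throughput of one PCIe 3.0 x16 link (per direction).
RAW_GT_PER_S = 8.0        # PCIe 3.0 signaling rate per lane
ENCODING = 128 / 130      # Gen3's 128b/130b encoding overhead
LANES = 16

usable_gb_per_s = RAW_GT_PER_S * ENCODING * LANES / 8
print(f"PCIe 3.0 x16: ~{usable_gb_per_s:.2f} GB/s per direction")  # ~15.75
```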

Each GPU is mated to 2 GB of GDDR5 memory over a 256-bit interface. Again, this is the same configuration we know from the Nvidia GeForce GTX 680, yielding the same peak bandwidth figure of 192 GB/s per GPU.
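
That figure falls straight out of the clocks; a quick sketch, if you want to check the math:

```python
# Peak GDDR5 bandwidth per GPU, from the clocks quoted above.
memory_clock_mhz = 1502                # base memory clock
effective_mt_s = memory_clock_mhz * 4  # GDDR5 moves data four times per clock
bus_width_bits = 256

peak_gb_s = effective_mt_s * 1e6 * bus_width_bits / 8 / 1e9
print(f"Peak memory bandwidth per GPU: {peak_gb_s:.1f} GB/s")  # ~192.3
```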

The cores themselves run slightly slower than the single-chip card's, though. Each GK104 on the GeForce GTX 690 operates at 915 MHz, rather than the 680's 1006 MHz. So long as Nvidia's 300 W TDP rating isn't exceeded, GPU Boost should be able to push clock rates up to 1019 MHz, which is only a little lower than the GeForce GTX 680’s maximum frequency.
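
To quantify what those clocks mean, here's the standard theoretical-throughput arithmetic (counting a fused multiply-add as two operations per core per cycle):

```python
# Theoretical single-precision throughput of one GK104 at base and boost clocks.
cuda_cores = 1536
flops_per_core_per_clock = 2   # a fused multiply-add counts as two operations

for label, clock_mhz in (("base 915 MHz", 915), ("boost 1019 MHz", 1019)):
    gflops = cuda_cores * flops_per_core_per_clock * clock_mhz / 1000
    print(f"{label}: {gflops:.0f} GFLOPS")   # ~2811 base, ~3130 boost
```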

Two eight-pin connectors, together with the PCI Express slot, combine to deliver up to 375 W. This board's predecessor, the GeForce GTX 590, also hit that limit. But dual-GPU cards usually use a little less power than two equivalent single-GPU cards running in SLI. As such, 300 W could be a realistic figure for EVGA's GeForce GTX 690.
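
The 375 W ceiling is simple addition under the PCI Express power-delivery limits:

```python
# Where the 375 W ceiling comes from (PCI Express power delivery limits).
SLOT_W = 75          # an x16 slot supplies up to 75 W
EIGHT_PIN_W = 150    # each eight-pin PCIe connector supplies up to 150 W

print(f"Maximum board power: {SLOT_W + 2 * EIGHT_PIN_W} W")  # 375 W
```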

The GeForce GTX 690 has three DVI connectors and one DisplayPort output, allowing the card to drive up to four displays simultaneously.

EVGA is particularly proud of its warranty coverage, which lasts for three years and is fully transferable. So, if you're the sort to buy the best of the best every year, whoever picks up your leftover GeForce GTX 690 on eBay in 2013 should still be covered. The company also offers warranty extensions out to five or ten years, though we see absolutely zero value in protecting a decade-old graphics card.

Also high on EVGA's list of accolades is its Precision X software, which Nvidia used to illustrate the functionality of GPU Boost back when it launched the GeForce GTX 680. The software facilitates core and memory clock rate control, fan speed tuning, and real-time monitoring of the 690's vital attributes.
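
Precision X itself is closed-source, but the same class of telemetry can be polled through Nvidia's NVML interface. Below is a minimal sketch using the pynvml bindings; it's our illustration rather than anything EVGA ships, and it assumes the driver exposes these counters (the 690's two GPUs enumerate as separate devices):

```python
# Polling the kind of vitals Precision X displays, via NVML (pip install pynvml).
# Illustrative stand-in only; assumes the driver exposes these counters.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # .gpu / .memory, in %
        fan_pct = pynvml.nvmlDeviceGetFanSpeed(handle)
        print(f"GPU {i}: {core_mhz} MHz core, {temp_c} C, "
              f"{util.gpu}% load, fan at {fan_pct}%")
finally:
    pynvml.nvmlShutdown()
```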

Lastly, we were excited to see EVGA launch controller software for the LED under this card's GeForce GTX logo (up on the top edge of the card). Nvidia told us something like this was in development back when we first reviewed the 690, but it wasn't ready yet. Used together with Precision X, the little utility can increase/decrease the LED's brightness based on GPU utilization, clock rate, or frame rate. Pretty cool, and only compatible with EVGA's GeForce GTX 690.
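
EVGA hasn't documented the LED controller's interface, so purely as an illustration, the utilization-to-brightness mapping could look like the sketch below; set_led_brightness() is a hypothetical stand-in for whatever the utility calls internally:

```python
# Hypothetical mapping from GPU load to logo-LED brightness.
# set_led_brightness() is a made-up placeholder; EVGA's real interface isn't public.
def utilization_to_brightness(gpu_util_pct, floor=10, ceiling=100):
    """Scale 0-100% GPU load linearly onto a brightness range (percent)."""
    clamped = max(0, min(gpu_util_pct, 100))
    return floor + (ceiling - floor) * clamped / 100

def set_led_brightness(pct):
    print(f"LED brightness -> {pct:.0f}%")  # stand-in for the vendor call

set_led_brightness(utilization_to_brightness(73))  # prints: LED brightness -> 76%
```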

Comments
  • 7
    mayankleoboy1, November 8, 2012 5:40 AM
    IMHO, the GTX 690 looks best. There is something really alluring about the shiny white metal and the fine mesh, along with the fluorescent green branding.
    Maybe I am too much of a retro SF buff :)
  • 12
    tacoslave, November 8, 2012 6:03 AM
    i wept
  • 18
    hellfire24, November 8, 2012 6:04 AM
    your test system is sexy!!!!!!!
  • 20
    willyroc, November 8, 2012 6:05 AM
    You can't really go wrong either way with these generally insane(so to speak) cards.
  • -7
    amuffin, November 8, 2012 6:34 AM
    Is it just me or do the 7970X2 and 7990 coolers look so fast and fugly? :heink: 
  • 21
    Anonymous, November 8, 2012 6:43 AM
    Thanks for the in-depth analysis with adaptive V-sync and RadeonPro helping with micro stutter.

    Not to take anything away from the hard work performed; I would have liked to have seen Nvidia's latest beta driver, 310.33, included as well, to see if Nvidia is doing anything to improve the performance of its card instead of just adding 3D Vision, AO, and SLI profiles.
  • 18
    esrever, November 8, 2012 6:45 AM
    can we get some quadfire benchmarks too? :D 
  • -6
    RazorBurn, November 8, 2012 6:55 AM
    AMD's dual GPU at 500+ watts of electricity is out for me... Too much power and noise...
  • 7
    mohit9206, November 8, 2012 6:56 AM
    Two 670s in SLI are better than spending on a 690, and two 7950s in CrossFire are better than spending on a 7990. This way you save nearly $300 both ways.
  • 24
    abbadon_34, November 8, 2012 7:57 AM
    wow, microstuttering is now a non-issue, at least for AMD
  • 10
    ojas, November 8, 2012 8:02 AM
    Good read!

    But I would have liked to see 680s in SLI, to see how they scale now compared to the 690.

    Also, would using two single GPUs in CF/SLI make a difference to the micro-stuttering charts? iirc, the PCIe controller is tied to the CPU for SB/IB chips? So that would mean no 3rd party bridge in between the two GPUs as in the case of the 7990 and 690. Would that make a diff?

    How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
  • 1
    Anonymous, November 8, 2012 8:07 AM
    RadeonPro is saving AMD's butt.

    But in the end, the 690 was slower than the 7990 in average frame rate, while with RadeonPro it is the 7990 that is slower, right?

    So yes, it's better than without, but the 690 is faster, just as smooth, and uses a built-in technology.

    AMD really needs to work on its CrossFire technology.
  • 12
    blazorthon, November 8, 2012 8:08 AM
    Quote (amuffin): Is it just me or do the 7970X2 and 7990 coolers look so fast and fugly?

    I don't think they look "fast and ugly", although I do think that the HIS model could do with some more finesse.
  • 10
    FormatC, November 8, 2012 8:17 AM
    Quote:
    How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
    The noise was measured on an open bench table, not in a case (no extra case fans, and an ultra-silent fan on the hidden CPU cooler).

    For the power consumption: 3 current clamps with monitoring ;)
  • 17
    Novuake, November 8, 2012 8:42 AM
    Interesting, AMD has a winner at the top tier! That hasn't happened in a while. Kudos to that.
  • 25
    twinshadow, November 8, 2012 8:47 AM
    if you are spending $1,000 on a video card, paying the power bill is not an issue
  • 6
    blazorthon, November 8, 2012 8:47 AM
    Quote (Novuake): Interesting, AMD has a winner at the top tier! That hasn't happened in a while. Kudos to that.

    Technically, HIS has a winner, not AMD, because AMD didn't launch a 7990/7970X2 reference ;)
  • -5
    blazorthon, November 8, 2012 8:54 AM
    Quote (twinshadow): if you are spending $1,000 on a video card, paying the power bill is not an issue

    Actually, the only person I ever recommended a GTX 690 to wanted it specifically because its low power consumption was literally enough to pay for itself compared to his previous graphics setup, due to his high cost for power. Some people looking for such high-end cards most certainly do care about power consumption.
  • 18
    FormatC, November 8, 2012 8:56 AM
    1 kWh in Germany: 0.25 euros (approx. 0.34 USD)
    This IS an issue. ;)