
EVGA GTX 295 Hydro Copper (2x896 MB)

Best Of The Best: High-End Graphics Card Roundup
By Tino Kreiss


This dual-chip card is a real monster (even more so than the reference version). The massive heatsink completely fills the space between the two circuit boards, which also makes the card heavy and raised our expectations for the cooling performance of such a substantial piece of gear.

In order to test the Hydro Copper from EVGA, we used the HydroGen cooling rig from MSI, because EVGA includes no additional water-cooling parts with this card to help keep costs down. Fortunately, once the GeForce GTX 295 is connected to the water hoses and the liquid starts circulating, the card remains quiet. There's naturally no fan noise at all, even though we're used to lots of racket from cards in this performance class.

After you get used to the relative quiet, you can take a deep breath and enjoy the unusual sensation of an ultra high-end graphics card that runs almost silently. Then we compared the clock rates of the reference board to those of this EVGA board, and our jaws dropped. The standard clocks on a GTX 295 are 576 MHz for the GPU, 1,242 MHz for the shaders, and 2 x 999 MHz for the graphics RAM. But EVGA raises those specs substantially: 720 MHz for the GPU, 1,548 MHz for the shader clock, and 2 x 1,080 MHz for the graphics RAM.
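
Just how big is that factory overclock? Here is a quick back-of-the-envelope check (a minimal sketch in Python, using only the clock rates quoted above):

    # Reference GTX 295 vs. EVGA Hydro Copper clocks (MHz), as listed above.
    clocks = {
        "GPU core":     (576, 720),
        "Shaders":      (1242, 1548),
        "Graphics RAM": (999, 1080),   # per chip; the card carries two sets
    }

    for domain, (stock, oc) in clocks.items():
        gain = (oc / stock - 1) * 100
        print(f"{domain}: {stock} -> {oc} MHz (+{gain:.1f}%)")

That works out to roughly +25.0% on the core, +24.6% on the shaders, and +8.1% on the memory, which explains the dropped jaws.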

In our very first benchmark, Fallout 3, the 3D power of the dual-GPU card hits a CPU bottleneck: the overclocked board delivers frame rates of nearly 90 frames per second (FPS) at every test resolution. But the best thing about the EVGA Hydro Copper is that it always stays quiet. Only the two 120 mm, seven-volt fans on the MSI HydroGen radiator make any sound at all. Anybody who has experienced the typical noise levels of an air-cooled Radeon HD 4870 X2 or GeForce 9800 GX2 will appreciate how well-suited the GTX 295 is to water cooling.

EVGA's package here is nothing short of impressive. Compared to a GeForce GTX 295 running reference frequencies, the overclock delivers a 10.5% performance improvement across the board. Using the standard Radeon HD 4870 X2 as a baseline, the EVGA GTX 295 Hydro Copper is 29.8% faster, which makes a world of difference. At 1920x1200, it is nearly impossible to exhaust the available 3D performance without hitting a platform ceiling, and at higher resolutions the differences are even more noticeable. It's going to take a while for another chip class or test card to match these results. Until then, EVGA and Nvidia lead our pack of high-end graphics cards.
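
Those two percentages also let us derive a figure the paragraph doesn't state directly: how a stock-clocked GTX 295 compares to the Radeon HD 4870 X2. A small sketch in Python (assuming both figures are averages over the same benchmark set):

    # Measured, from the roundup results above.
    evga_vs_stock_gtx295 = 1.105   # EVGA Hydro Copper vs. reference GTX 295 (+10.5%)
    evga_vs_hd4870x2     = 1.298   # EVGA Hydro Copper vs. Radeon HD 4870 X2 (+29.8%)

    # Derived: reference GTX 295 vs. Radeon HD 4870 X2.
    stock_vs_x2 = evga_vs_hd4870x2 / evga_vs_stock_gtx295
    print(f"Stock GTX 295 vs. HD 4870 X2: +{(stock_vs_x2 - 1) * 100:.1f}%")

In other words, even at reference clocks the GTX 295 comes out roughly 17.5% ahead of the HD 4870 X2 on this test suite.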

Our water-cooling setup also enables great control over operating temperatures. At idle, the card runs at 39 degrees Celsius. Under heavy, extended 3D loads, the GPU core climbs to a modest 68 degrees Celsius. The cooling system circulates 500 cc of liquid, and the seven-volt fans spin relatively slowly. Power consumption also offers some good news: 188 W in 2D mode and 532 W in 3D mode (measured at the wall socket for the entire system). The card also clocks down to 300/100 MHz (GPU/graphics RAM) in desktop mode.

The graphics chip supports DirectX 10, PhysX, and CUDA. Both of its PCBs measure 10.4" (26.5 cm) long. Providing ample power requires attaching one six-pin and one eight-pin PCIe auxiliary connector, both located on the edge of the board. Thanks to its (relatively) compact design, the card occupies two expansion slots, just like the reference design and its cooler. Bundled accessories include a power cable splitter and an S/PDIF cable for routing audio over HDMI; the HDMI port sits on the card itself.

As mentioned, there is no water-cooling gear (hoses, connectors, pump, and so forth) included. The cooler on the graphics card is, however, anodized both inside and out. The copper contacts on the GPUs are also nickel-plated to limit corrosion. For long-term use, EVGA recommends a non-conductive coolant to block corrosion and algae growth.

This thread is closed for comments.

Comments
  • 23
    Anonymous , May 22, 2009 6:24 AM
    Only one ATi card? What happened to all those OC'd 4890s?
  • 8
    Anonymous , May 22, 2009 6:27 AM
    And those HAWX benchmarks look ridiculous. ATi should wipe the floor with Nvidia there. Of course you didn't turn DX10.1 support on. Bastard...
  • 1
    cangelini , May 22, 2009 6:35 AM
    quarz: Only one ATi card? What happened to all those OC'd 4890s?

    These are the same boards that were included in the recent charts update, and are largely contingent on what vendors submit for evaluation. We have a review upcoming comparing Sapphire's new 1 GHz Radeon HD 4890 versus the stock 4890. It'll be up in the next couple of weeks, though.
  • 4
    ohim , May 22, 2009 6:52 AM
    Am I the only one who finds this article awkward? Looking at the tests done on the ATI cards in The Last Remnant makes me wonder what went wrong... I mean, it's the UT3 engine... why such low performance?
  • 10
    curnel_D , May 22, 2009 6:57 AM
    Ugh, please tell me that The Last Remnant hasn't been added to the benchmark suite.

    And I'm not exactly sure why the writer decided to bench Endwar instead of World in Conflict. Why is that, exactly?

    And despite Quarz2's apparent fanboyism, I think HAWX would have been better benched under DX10.1 for the ATI cards, using the highest stable settings instead of dropping down to DX9.
  • 16
    Anonymous , May 22, 2009 7:42 AM
    Weird test:
    1) Where are the overclocking results?
    2) Bad choice of benchmarks: too many old DX9-based graphics engines (FEAR 2, Fallout 3, Left 4 Dead at >100 FPS), or Endwar, which is limited to 30 FPS. Where is Crysis?
    3) 1920x1200 as the highest resolution for high-end cards?
  • 4
    EQPlayer , May 22, 2009 7:47 AM
    It seems the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included... it's fairly obvious something odd is going on when you look at the numbers for that game.
  • 9
    armistitiu , May 22, 2009 7:48 AM
    Worst article in a long time. Why compare how old games perform on Nvidia's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good stuff from ATI too. So what, you just took the reference cards from ATI and tested them? :| That is just wrong.
  • -7
    pulasky , May 22, 2009 8:00 AM
    WOW, what a piece of s********** this """"""review"""""" is. Noobidia pays well these days.
  • -1
    darkpower45 , May 22, 2009 8:00 AM
    OK, I tried playing The Last Remnant on my comp with my 4870 X2 and it failed hardcore >.< The game itself is ridiculously boring, too. Sooo, why is it on the benching list?? *shakes head* Makes me sad...
  • 1
    guusdekler , May 22, 2009 8:02 AM
    I find it a shame that these tests do not include the 3DMark Vantage suite.
    OK, there aren't many games using DX10, but some very good ones do!

    That's the reason I've switched to Vista.

    And enough people with me to justify a proper DX10 benchmark.
  • -6
    Luscious , May 22, 2009 8:07 AM
    No mention of the GTX 285 2GB version? I'm planning on picking up three of these for a tri-SLI Core i7 build, all water-cooled and overclocked.
  • -9
    Ellimist , May 22, 2009 8:19 AM
    Well, I'm running a factory-overclocked GTX 285, only because I like solid drivers and DAAMIT doesn't seem to be able to provide them consistently. That's been my biggest problem in picking up an ATI card.

    This review, however, is terrible. The benchmark selection is dated, if nothing else. Even Tom's other recent reviews have used better benchmarks than this.
  • 10
    sosofm , May 22, 2009 8:24 AM
    This benchmark is not fair to ATI!!!
  • 3
    IronRyan21 , May 22, 2009 9:00 AM
    Let's see some 3DMark Vantage, please.
  • 0
    drealar , May 22, 2009 9:18 AM
    As with car review magazines (like the one my friend works for), I THINK they only have the cards that were submitted to them, and (not sure if this is the case with Tom's) the cards are only lent for a limited number of days.

    Although I'm not very satisfied (because of the lack of ATI cards in your possession), I thank you for the review, with Fallout, Left 4 Dead, and Last Remnant under DX9. Yup, I'm still using XP because the bog-down with Vista is too noticeable on my rig.

    1920x1200 as the minimum threshold? Cool, as my 23" is limited to 1920x1080 anyway :p
  • 1
    JeanLuc , May 22, 2009 10:09 AM
    I think you guys should cut Tino Kreiss some slack; this is, I believe, his first publication? Saying things like "this is the worst article I've read in a long time" doesn't actually help. You can blame the choice of benchmark suites on the site's manager/editor, not the author, as he only writes what he is told to. So with that in mind............ Cangelini, your fired.

    I am curious, though. HAWX is a game sponsored by ATI, so why is the HD 4890 getting its backside tanned by the GTX 275? It's not just a few FPS behind, either; the difference is quite remarkable. And yes, I do realise the BFG GTX 275 is overclocked, but not by a lot.
  • 0
    cangelini , May 22, 2009 10:16 AM
    JeanLuc: Cangelini, your fired.

    Perhaps you'll hire me as a copy editor for your posts instead? ;-)

    In all seriousness, Tino has been with Tom's German office for a long time. I've asked the staff responsible for testing there to drop in and provide some feedback on the products and benchmarks used here.

    Best,
    Chris
  • 9
    linaaslt , May 22, 2009 10:25 AM
    How lame this article is... I was always wondering why they don't use the full potential of the GPU. If ATI is capable of using DX10.1 (and the game uses that technology), why not use it? It might not be fair to Nvidia, but FFS, I believe this kind of review should show the full potential of the products.
    Shame on TH!