High-End Graphics Card Roundup

Zotac GTX 285 AMP Edition (GeForce GTX 285, 1,024 MB)

The fastest GeForce GTX 285 in our test field comes from Zotac. Its AMP Edition has only two current competitors in its league: the GeForce GTX 295 and the Radeon HD 4870 X2. Because both of those are dual-GPU models, the Zotac GTX 285 AMP Edition is, without question, the fastest single-GPU graphics card on the market today.

The Zotac model earns its position through overclocking (the abbreviation AMP indicates amplified clock speeds). Standard frequencies come in at 648 MHz for the GPU, 1,476 MHz for the shaders, and 2 x 1,242 MHz for the graphics RAM; Zotac raises these to 702 MHz (GPU), 1,512 MHz (shaders), and 2 x 1,296 MHz (memory). In overall performance, the higher clocks put the AMP card about 2.6% ahead of the MSI GeForce GTX 285 SuperPipe and 4.8% ahead of the reference GeForce GTX 285.
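
To put those factory clock increases in perspective, the short Python sketch below works out the percentage bump in each clock domain from the frequencies quoted above. It is purely illustrative and not part of our test methodology.

    # Percentage increase of Zotac's AMP clocks over Nvidia's reference
    # GeForce GTX 285 clocks, using the frequencies quoted above (in MHz).
    reference = {"GPU": 648, "shaders": 1476, "memory": 1242}
    amp = {"GPU": 702, "shaders": 1512, "memory": 1296}

    for domain, base in reference.items():
        gain = (amp[domain] - base) / base * 100
        print(f"{domain}: {base} MHz -> {amp[domain]} MHz (+{gain:.1f}%)")

    # Prints roughly: GPU +8.3%, shaders +2.4%, memory +4.3% -- the raw clock
    # deltas to weigh against the ~2.6% and ~4.8% benchmark gains cited above.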

Of course, the gains aren't free; they come with higher noise levels. In 2D mode, the Nvidia reference design remains quiet at 37.9 dB(A), but under heavy load, noise climbs to a noticeably audible 51.4 dB(A). This is a pity, because the 55 nm GeForce GTX 260 runs at 41.2 dB(A) and the old 65 nm GeForce GTX 280 measures 45.4 dB(A). Under load, the GeForce GTX 285's reference cooler behaves more like the original 65 nm GeForce GTX 260's, which reached 53.8 dB(A). It goes without saying, then, that the fans on MSI's SuperPipe were much quieter.

The graphics chip supports DirectX 10, PhysX, and CUDA. The card's PCB is 10.6" (27 cm) long, and it runs at 300 MHz/100 MHz (GPU/graphics RAM) in desktop mode. The board requires two six-pin PCIe power connectors, both of which are mounted on the rear edge. This design covers two motherboard slots, just like most of the other high-end offerings in our roundup. The retail package includes Racedriver GRID, two power-splitter cables, an HDMI adapter, an S/PDIF cable, and a copy of 3DMark Vantage Advanced. Finally, you'll find two dual-link DVI ports and a video output on the I/O bracket.

  • quarz
    Only one ATi card? What happened to all those OC'd 4890s?
  • And those HAWX benchmarks look ridiculous. ATi should wipe the floor with Nvidia there. Of course you didn't turn on DX10.1 support. Bastard...
  • cangelini
    quarz: "Only one ATi card? What happened to all those OC'd 4890s?"
    These are the same boards that were included in the recent charts update, and are largely contingent on what vendors submit for evaluation. We have a review upcoming comparing Sapphire's new 1 GHz Radeon HD 4890 versus the stock 4890. It'll be up in the next couple of weeks, though.
  • ohim
    Am I the only one who finds this article awkward? Looking at the tests done on ATI cards in The Last Remnant makes me wonder what went wrong ... I mean, it's the UT3 engine ... why such low performance?
  • curnel_D
    Ugh, please tell me that The Last Remnant hasn't been added to the benchmark suite.

    And I'm not exactly sure why the writer decided to bench EndWar instead of World in Conflict. Why is that, exactly?

    And despite Quarz2's apparent fanboyism, I think HAWX would have been better benchmarked under DX10.1 for the ATI cards, using the highest stable settings instead of dropping back to DX9.
  • anamaniac
    The EVGA 295 is the stuff gods game with.

    I would love that card. I would have to replace my whole system to run it properly, however.
    I want $1,500 now... an i7 920 (why get anything better? They all seem to be godly overclockers) and an EVGA 295.

    How about testing the EVGA GTX 295 in SLI for a quad-GPU configuration? I know there are driver issues, but it would be fun to see what it could do regardless, along with seeing how far Tom's can OC the EVGA GTX 295.
    Actually... Tom's just needs to do a new system-building recommendation roundup. I find them useful personally, and I would have used one myself had my cash source not lost his job...
  • Weird test:
    1) Where are the overclocking results?
    2) Bad choice of benchmarks: too many old, DX9-based graphics engines (F.E.A.R. 2, Fallout 3, Left 4 Dead at >100 FPS), or EndWar, which is limited to 30 FPS. Where is Crysis?
    3) 1920x1200 as the highest resolution for high-end cards?
  • EQPlayer
    Seems that the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included... it's fairly obvious something odd is going on when you look at the numbers for that game.
  • armistitiu
    Worst article in a long time. Why compare how old games perform on Nvidia's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good stuff from ATI, too. So what... you just took the reference cards from ATI and tested them? :| That is just wrong.
  • pulasky
    WOW, what a piece of s********** this """"""review"""""" is. Noobidia pays well these days.