Nvidia's GeForce GTX 285: A Worthy Successor?

Faster And Cheaper

A manufacturing process shrink is a lot to get excited about for both vendors and their customers, as the change normally leads to increased performance and efficiency, while reducing production cost by increasing the number of units produced per wafer. But the eventual benefit to design firms often comes at a huge up-front cost, since significant alterations normally result in bugs that need to be fixed before a full production run can begin. Each test run costs hundreds of thousands of dollars, so it pays to get things right the first or second time. Production delays are an even costlier problem when new architecture is involved, which is why Nvidia normally updates its current products before introducing any new ones.
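
To make the per-wafer economics concrete, here is a rough sketch using the standard dies-per-wafer approximation. The die areas are approximate published figures for GT200 (65 nm) and its 55 nm shrink, GT200b, and the numbers are our own illustration rather than anything from Nvidia:

```python
import math

# Rough dies-per-wafer estimate using the standard approximation:
# usable die candidates ~= wafer area / die area, minus an edge-loss term.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Approximate published die sizes; real output also depends on defect density.
print(dies_per_wafer(576))  # GT200 at 65 nm: ~94 die candidates
print(dies_per_wafer(470))  # GT200b at 55 nm: ~119 die candidates
```

That is roughly a quarter more candidates per wafer before yield even enters the picture, which is where the production-cost savings come from.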

Today’s updated product, the GeForce GTX 285, makes the typical promises of improved performance and efficiency compared to the GeForce GTX 280 on which it is based. Let’s take a quick look at how it compares to other high-end solutions.

                      | GeForce GTX 285 | GeForce GTX 280 | GeForce GTX 260 | GeForce GTX 295 | Radeon HD 4870 X2
Manufacturing Process | 55 nm TSMC      | 65 nm TSMC      | 65 nm TSMC      | 55 nm TSMC      | 55 nm TSMC
SPs                   | 240             | 240             | 216             | 480 Total       | 1,600
Core Clock            | 648 MHz         | 602 MHz         | 576 MHz         | 576 MHz         | 750 MHz
Shader Clock          | 1,476 MHz       | 1,296 MHz       | 1,242 MHz       | 1,242 MHz       | 750 MHz
Memory Data Rate      | 2,484 MHz       | 2,214 MHz       | 1,998 MHz       | 1,998 MHz       | 3,600 MHz
Frame Buffer          | 1 GB            | 1 GB            | 896 MB          | 1,792 MB Total  | 2 GB Total
Memory Bus Width      | 512-bit         | 512-bit         | 448-bit         | 448-bit x 2     | 256-bit x 2
ROPs                  | 32              | 32              | 28              | 56 Total        | 32 Total
Price                 | $380            | ~$340           | ~$260           | ~$500           | ~$430

The GeForce GTX 285’s most noticeable performance-oriented improvement is a GPU clock speed increase of around 8% over the GTX 280. The memory clock increase, while much larger at roughly 12%, is likely not as important to performance. The GeForce GTX 295 gets two of these processors, although each one is handicapped with lower GPU clock speed, memory speed, and memory bus width. Meanwhile, AMD earns bragging rights for both GPU clock and memory data rate, but only because less-complex graphics processors typically clock higher and GDDR5 memory uses a quad-data-rate bus.
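
Those bus widths and data rates combine into theoretical peak memory bandwidth, which shows why AMD’s narrower bus isn’t the handicap it might appear to be. Here is a minimal back-of-the-envelope sketch in Python, our own illustration using only the paper specs from the table above (not measured numbers):

```python
# Theoretical peak memory bandwidth: (bus width in bits / 8) bytes per
# transfer, times the effective data rate. Paper specs from the table
# above; per-GPU figures for the dual-GPU boards.
cards = {
    "GeForce GTX 285":   (512, 2_484),  # bus bits, effective MHz
    "GeForce GTX 280":   (512, 2_214),
    "GeForce GTX 260":   (448, 1_998),
    "GTX 295 (per GPU)": (448, 1_998),
    "4870 X2 (per GPU)": (256, 3_600),  # GDDR5: 900 MHz base x 4
}

for name, (bus_bits, rate_mhz) in cards.items():
    gb_per_s = bus_bits / 8 * rate_mhz * 1e6 / 1e9
    print(f"{name}: {gb_per_s:.1f} GB/s")
```

On paper, each RV770 on the 4870 X2 actually edges out the GTX 260/295 GPUs in bandwidth despite a bus half as wide; that is the quad-data-rate effect at work.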

The specific card in today’s review is a special "XXX" sample of XFX’s GeForce GTX 285, model GX-285N-ZDDF, sporting a 670 MHz core clock and GDDR3-2500 memory. The larger numbers look more impressive than they are: the GPU is less than 4% and the RAM less than 1% above the reference specification, so we’ll split the difference and consider it a likely 2-3% average improvement over base speed.
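
For the curious, those percentages amount to a two-line sanity check against the reference clocks (again, just our own arithmetic):

```python
# XFX "XXX" factory overclock vs. GTX 285 reference clocks.
ref_core, xxx_core = 648, 670      # core clock, MHz
ref_mem,  xxx_mem  = 2_484, 2_500  # effective memory clock, MHz

print(f"GPU: +{(xxx_core / ref_core - 1) * 100:.1f}%")  # +3.4%
print(f"RAM: +{(xxx_mem  / ref_mem  - 1) * 100:.1f}%")  # +0.6%
```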

In addition to the basics, XFX includes a door tag, the game Far Cry 2, a DVI-to-HDMI adapter, and an S/PDIF breakout cable. The breakout cable connects a motherboard’s internal S/PDIF audio output to an input adjacent to the card’s power connections, and the combined audio/video signal can then be accessed through the output of the HDMI adapter. While this method has been available on Nvidia products for several generations, many previous packages did not include the special cable.

Thomas Soderstrom
Thomas Soderstrom is a Senior Staff Editor at Tom's Hardware US. He tests and reviews cases, cooling, memory and motherboards.
  • Proximon
    Perfect. Thank you. I only wish you could have thrown a 4870 1GB and a GTX 260+ into the mix, since you had what I'm guessing are new beta drivers. Still, I guess you have to sleep sometime :p
    Reply
  • fayskittles
    I would have liked to see the overclocking that could be done on all the cards, and how they compare then.
    Reply
  • hannibal
    It's so good to see competition!
    Reply
  • I would like to see a benchmark between SLI 260 Core 216, SLI 280, SLI 285, GTX 295, and 4870X2
    Reply
  • ravenware
    Thanks for the article.

    Overclocking would be nice to see what the hardware can really do, but I generally don't dabble in overclocking video cards. It never seems to work out; either the card is already running hot or the slightest increase in frequency produces artifacts.
    Also, driver updates seem to wreak havoc with OC settings.
    Reply
  • wlelandj
    Personally, I'm hoping for a non-crippled GTX 295 using the GTX 285's full specs (higher core clock, shader clock, memory data rate, frame buffer, memory bus width, and ROPs). Me & my $$$ will be waiting.
    Reply
  • A Stoner
    I went for the GTX 285. I figure it will run cooler, allow higher overclocks, and maybe save energy compared to a GTX 280. I was able to pick mine up for about $350, while most GTX 280 cards are still selling for above $325 without mail-in rebates counted. Thus far, over the last three years, I have had exactly 0 out of 12 mail-in rebates for computer components honored.
    Reply
  • A Stoner
    ravenware: "Thanks for the article. Overclocking would be nice to see what the hardware can really do, but I generally don't dabble in overclocking video cards. It never seems to work out; either the card is already running hot or the slightest increase in frequency produces artifacts. Also, driver updates seem to wreak havoc with OC settings."

    I just replaced an 8800 GTS 640MB card with the GTX 285. Base clocks for the GTS are 500 MHz GPU and 800 MHz memory; I forget the shader clock, but it is over 1,000 MHz. I had mine running with zero glitches for the life of the card at 600 MHz GPU and 1,000 MHz memory. Before the overclock, the highest temperature at load was about 88C; after the overclock, the highest was 94C, both of which were well within the manufacturer specification of 115C. I would not be too scared of overclocking your hardware, unless your warranty is voided because of it.

    I have not overclocked the GTX 285 yet; I am waiting for NiBiToR v4.9 to be released so that once I overclock it, I can set it permanently to the final stable clock. I am expecting to be able to hit about 730 MHz on the GPU, but it could be less.
    Reply
  • daeros
    "Because most single-GPU graphics card buyers would not even consider a more expensive dual-GPU solution, we’ve taken the unprecedented step of arranging today’s charts by performance-per-GPU, rather than absolute performance."

    In other words, no matter how well ATI's strategy of using two smaller, cheaper GPUs in tandem instead of one huge GPU works, you will still be able to say that Nvidia is the best.

    Also, why would most people who are spending $400-$450 on video cards not want a dual-card setup? Most people I know see it as a kind of bragging right, just like water-cooling your rig.

    One last thing: why is it so hard to find reviews of the 4850 X2?
    Reply
  • roofus
    Because multi-GPU cards come with their own bag of headaches, Daeros. You are better off going CF or SLI than participating in that pay-to-play experiment.
    Reply