GeForce GTX 295 Performance: Previewed

GeForce GTX 295, Dissected

Initial speculation was that the GeForce GTX 295 would consist of two GT200s in a configuration that’d emulate a pair of GTX 260s. In actuality, the card boasts a pair of full-strength GT200s with 240 processing cores each, mated to a more GTX 260-like back-end/memory configuration.

The original GT200 was a 1.4 billion transistor behemoth manufactured using TSMC’s 65 nm node. The version of the chip GeForce GTX 295 employs is die-shrunk to 55 nm. As part of the transition, Nvidia’s Jason Paul claims the company has also made silicon timing changes to improve performance per watt, which should manifest themselves in our discussion of power consumption.

Like the GeForce GTX 280, each GPU on the GTX 295 has, as mentioned, 240 SPs and 80 texture address/filtering units. But, like the GTX 260, each of the 295’s GPUs includes seven ROP/framebuffer partitions, totaling 28 ROPs and an aggregate 448-bit path to 896 MB of GDDR3. Vital clocks are also in line with the GeForce GTX 260: the core clock, covering the texture units and ROPs, runs at 576 MHz; the stream processors run at 1,242 MHz; and the memory runs at 999 MHz (1,998 MHz effective). As you can see, each chip sits architecturally right between Nvidia’s fastest and second-fastest ASICs.
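
For a quick sanity check on those figures, here’s a short Python sketch (ours, not Nvidia’s) that derives per-GPU memory bandwidth and peak single-precision shader throughput from the clocks and bus width quoted above. The 3-FLOPs-per-SP-per-clock factor assumes Nvidia’s usual MAD + MUL counting for GT200.

```python
# Back-of-the-envelope per-GPU figures for one of the GTX 295's GT200s,
# based on the clocks and bus width quoted in the text above.

bus_width_bits = 448        # memory interface per GPU
mem_clock_eff_mhz = 1998    # effective GDDR3 data rate (999 MHz, double-pumped)
sp_count = 240              # stream processors per GPU
shader_clock_mhz = 1242     # shader domain clock
flops_per_sp_per_clock = 3  # assumption: MAD + MUL, Nvidia's usual GT200 math

# Bandwidth: (bits / 8) bytes per transfer, times transfers per second.
bandwidth_gb_s = (bus_width_bits / 8) * mem_clock_eff_mhz / 1000
# Peak single-precision throughput.
peak_gflops = sp_count * shader_clock_mhz * flops_per_sp_per_clock / 1000

print(f"Per-GPU memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~111.9 GB/s
print(f"Per-GPU peak throughput:  {peak_gflops:.0f} GFLOPS")   # ~894 GFLOPS
```

Double both numbers for the card as a whole, since each GPU has its own memory interface and frame buffer.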


                      | GeForce GTX 295 | GeForce GTX 280 | GeForce GTX 260 | GeForce 9800 GX2 | Radeon HD 4870 X2
Manufacturing Process | 55nm TSMC       | 65nm TSMC       | 65nm TSMC       | 65nm TSMC        | 55nm TSMC
SPs                   | 480             | 240             | 216             | 256              | 1,600
Core Clock            | 576 MHz         | 602 MHz         | 576 MHz         | 600 MHz          | 750 MHz
Shader Clock          | 1,242 MHz       | 1,296 MHz       | 1,242 MHz       | 1,500 MHz        | 750 MHz
Memory Clock          | 1,998 MHz Eff.  | 2,214 MHz Eff.  | 1,998 MHz Eff.  | 2,000 MHz Eff.   | 3,600 MHz Eff.
Frame Buffer          | 1,792 MB Tot.   | 1 GB            | 896 MB          | 1 GB Tot.        | 2 GB Tot.
Memory Bus Width      | 448-bit x 2     | 512-bit         | 448-bit         | 256-bit x 2      | 256-bit x 2
ROPs                  | 56 Tot.         | 32              | 28              | 32 Tot.          | 32 Tot.
Price                 | $499 MSRP       | ~$380           | ~$230           | N/A              | ~$500


2 GPUs, 1 Card

At first glance, the GeForce GTX 295 looks like it could be a 280 or 260. When you flip it onto its stomach and look at the PCB on its back side, it’s clear there is only one GPU there. Like the 9800 GX2 and 7950 GX2 that came before it, this board centers on a dual-PCB design that sandwiches a special heatsink/blower combination between two separate graphics boards linked by an SLI cable and encased in a protective shell.

Naturally, the design of the cooler must be adapted to take the pair of PCBs into account, so you’ll see holes cut in both boards for air to be sucked through. The complete card occupies two expansion slots of space, so it isn’t any wider than Nvidia’s single-chip offerings. In fact, it’s also the same length as the GeForce GTX 280 (and AMD’s Radeon HD 4870 X2).

Let’s Talk Power

Nvidia isn’t ready to have the GTX 295’s power consumption plotted against AMD’s solution. However, the card, as it sits, is less power-hungry at both idle and load than the Radeon HD 4870 X2. The figures below reflect total system consumption, measured at the wall.

On paper, the GeForce GTX 295 is rated at up to 289 W on its own, while the 4870 X2 carries a 286 W TDP. And yet, when we measured total system draw at the socket, the GTX 295 idled 10 W lower than the AMD board. While looping the Far Cry 2 benchmark at 2560x1600 with AA and AF cranked up, the Nvidia board averaged a full 50 W lower.
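
As a rough illustration of what those wall-socket deltas mean for the cards themselves, the short Python sketch below converts them to approximate DC-side differences. The ~80% PSU efficiency figure is our assumption, not a measured value.

```python
# Illustrative only: translating measured wall-socket deltas into approximate
# DC-side (board-level) power differences, assuming ~80% PSU efficiency.

psu_efficiency = 0.80        # assumption; real efficiency varies with load and unit

idle_delta_wall_w = 10       # GTX 295 system idled 10 W lower at the wall
load_delta_wall_w = 50       # and averaged 50 W lower in the Far Cry 2 loop

idle_delta_dc_w = idle_delta_wall_w * psu_efficiency   # ~8 W of board-level power
load_delta_dc_w = load_delta_wall_w * psu_efficiency   # ~40 W of board-level power

print(f"Approximate DC-side idle advantage: {idle_delta_dc_w:.0f} W")
print(f"Approximate DC-side load advantage: {load_delta_dc_w:.0f} W")
```

In other words, despite nearly identical paper TDPs, the measured gap under load works out to roughly 40 W of actual board power in the GTX 295’s favor, under the efficiency assumption above.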

Of course, we’ll have to wait until early January for final fan speeds and power figures. But early in the game, the shift to 55 nm is treating the GeForce GTX 295 well, despite the massive size of its two GT200 GPUs.

  • Comments
  • titdoctor
    wait until ATI updates their drivers again. 4870x2 FTW
    -4
  • Tindytim
    First!?

    Why do I get the feeling AMD is already working on something to bust Nvidia again?
    12
  • cangelini
    8.12 was definitely a nice update!
    6
  • NarwhaleAu
    Your conclusion was, at best, poor.

    Nvidia's "fastest single card" is two 280s on a single PCB, selling at the price point that ATI is selling their 4870x2 at right now?

    It is a lot cheaper to produce the 4870 GPU, so I am sure you will see ATI cut their price down by at least $50, and maybe $100. Nvidia will then have the same problem - a monolithic GPU that is expensive to produce and not really any faster than the 4870.
    -3
  • xsane
    I totally agree with him on the Physx and CUDA comment. It would be really nice to have a game like Tiger Woods support Physx.

    I have 2 x 4850 in crossfire, it kicks ass.
    0
  • trainreks
    good to see that nvidia whipped back into submission. Their prices were ridiculous when they were on the top for a long time.
    11
  • malveaux
    NarwhaleAU:

    You clearly need to re-read this article.
    And cutting prices $50? $100? Yea, born yesterday? Not happening.

    @Article

    Thanks for the preview! I've been looking out for the GTX295 to surface. Two GTX260's should perform right on par with the thing, and I was wondering what the price would turn out to be. You can get GTX260's for $219 from the Egg right now (or $440 for two). If the GTX295 is only a single card at $499 (likely to be 20 less at the Egg), it's right on the same price area as buying two 260's separately. And in that situation, I'd rather have a single card with the same power. As would most folk I wager. So looks like the 295 is gonna be a real winner in the enthusiast market.

    Very best,
    -7
  • JAYDEEJOHN
    Thanks for being open and honest, and mentioning nVidias mandate. It looks as expected, and is a shame we dont have a larger picture of full performance, since nVidia hamstringed you guys. Good to see some competition at the highend
    3
  • sparky2010
    The problem with ATI is that they release good products but give them incomplete/unoptimized drivers.. to see games where the difference between the 4870 and the X2 is almost nil, but the GTX 295 is doing well in it, well, that's no excuse for ATI.. it's too bad though.. i really hope they could just give us good drivers from the beginning, instead of giving us "performance upgrade packages"..

    I hope that their next driver will see more optimization, and then a showdown! CROSSFIRE X vs. QUAD SLI!!!! MUAHAHAHA!

    Bets down please?
    3
  • drysocks
    I'll be impressed if it costs less than the Radeon HD 4870 X2. ~470 atm
    8
  • cleeve
    NarwhaleAu: Your conclusion was, at best, poor. Nvidia's "fastest single card" is two 280s on a single PCB, selling at the price point that ATI is selling their 4870x2 at right now? It is a lot cheaper to produce the 4870 GPU, so I am sure you will see ATI cut their price down by at least $50, and maybe $100. Nvidia will then have the same problem - a monolithic GPU that is expensive to produce and not really any faster than the 4870.


    Why was it poor? Are you saying the 295 is invalid because nvidia uses two boards on their dual-GPU card and Ati uses a single board?

    Are you also saying Nvidia won't be willing to price match performance, when that's exactly what they've done with their current line-up?

    While it'll likely hurt Nvidia's bottom line more than Ati's to lower pricing, that hasn't stopped them up until now, and doesn't really have an impact on the article's conclusion does it?

    As long as it's readily available at launch, kudos to Nvidia. But Chris' conclusion looks bang on to me. I'm not sure what part of it you have a problem with.
    18
  • JAYDEEJOHN
    If anyones seen other previews on this, the minimum fps in Crysis is horrible on this card. One could make same claims as to nVidias drivers for this too. The G200 series is 1 driver ahead of ATI, so give it time, just as Im sure nVidia will have the minimum fps cleaned up for Crysis with this card 1 month from now
    1
  • scook9
    SO..........i see nvidia still has not figured out how to use GDDR5?!?! If they could get that worked into the GTX295, i dont think ANYONE would be able to top that for a while.

    Still dissappointed that nvidia has said nothing about GDDR5...Toms already did an article a while ago on how Hynix is now making 1gb DDR5 low latency chips somewhat cheap..whats the hold up Nvidia
    0
  • cleeve
    sparky2010: The problem with ATI is that they release good products but give them incomplete/unoptimized drivers


    Drivers seem fine to me. Remember, the 4870 wasn't designed to be as powerful as the GTX280. It was made to be more efficient, cheaper to manufacture, and scalable.

    The 4870 X2 still shows up the GTX280, and that was their goal. The GTX 295 adds more spice to the mix, and kudos to Nvidia, but Ati never claimed to that the 4870 GPU would be the fastest GPU available; they went for cost effective scalability, and that's what they got.

    Not much to do with the drivers.
    8
  • cleeve
    scook9: SO..........i see nvidia still has not figured out how to use GDDR5?!?! If they could get that worked into the GTX295, i dont think ANYONE would be able to top that for a while. Still dissappointed that nvidia has said nothing about GDDR5... ...whats the hold up Nvidia


    As I understand it, the ability to use GDDR5 is a design-level decision, and Nvidia's current lineup has been designed some time ago. The 55nm refresh won't involve a major redesign, just a die shrink.

    You can bet Nvidia's next gen products will likely be designed around GDDR5 though.
    7
  • billiardicus
    Great write up. Thanks guys! This is why I visit Tom's hardware everyday.
    5
  • enyceckk101
    wow , this card doesn't consumption any power when u playing far cry 2 ?

    Im getting this card !
    -5
  • JAYDEEJOHN
    Have to agree, this preview is nice, open and honest. Nicely done fellas. Its an open ended preview, not the review, and its done so in that form
    4
  • sparky2010
    Cleeve: Drivers seem fine to me. Remember, the 4870 wasn't designed to be as powerful as the GTX280. It was made to be more efficient, cheaper to manufacture, and scalable. The 4870 X2 still shows up the GTX280, and that was their goal. The GTX 295 adds more spice to the mix, and kudos to Nvidia, but Ati never claimed to that the 4870 GPU would be the fastest GPU available; they went for cost effective scalability, and that's what they got. Not much to do with the drivers.


    True, but i'm comparing the 4870 to its big brother, the X2.. there are games where the difference is almost nothing.. so basically multi-gpu optimization for some games is, well, there isn't any... I know that the GTX 280 is superior to the 4870.. but i think that it's a shame that ATI could get better numbers from their cards but instead refuse to put the effort.. instead of their 6 month old card being already 100%, nVidia has a card that's not even released yet, on a new die, that has beta drivers that seem to function alot better than ATI's, that's all.. Yet we receive drivers from ATI every now and then that "unlock" more performance.. i don't know, it's like buying a car and the dealer telling you "Hey! Come back after 10k and i'll give you 20 more HP! XD" lol..
    1
  • JAYDEEJOHN
    Read my link. The minimum fps sucks on this card, the 295. Its a driver issue. It happens, even when youre working with devs early on in the dev of a new game, like nVidia does with its TWIMTBP program
    0