Nvidia GeForce GTX 260/280 Review

Bottom Line

What’s our final impression of the new cards? First of all, the parallel with the GeForce 7800 GTX is obvious: they build on an already-proven architecture by improving the few weak points that had been identified and significantly boosting processing power. So there are no unpleasant surprises as far as the architecture is concerned, with the possible exceptions of the absence of Direct3D 10.1 support and the slightly disappointing double-precision floating-point performance.

On the other hand, unlike today, at the time of the 7800 GTX Nvidia hadn’t yet put SLI to work on a dual-GPU card like the 9800 GX2. That’s the drawback of this launch: despite raw processing power that’s nearly double the previous generation’s (even under real-world conditions, thanks to improved efficiency, though some of our synthetic tests didn’t reach the heights we might have hoped for), the GTX 280 didn’t fare well against the 9800 GX2, which beat it fairly regularly in gaming tests. It would be crazy to prefer the older card, which has only half the usable memory and much higher power consumption at idle, to cite just two of its now-obvious disadvantages, but its results inevitably take some of the shine off the new ultra-high-end card. Still, it would be hard to fault Nvidia here: squeezing even one more ALU onto the gigantic die seems inconceivable, especially when you compare the chip with the competition’s offerings.

There are several other slightly regrettable points as well, first among them the very high noise level, which is hard to explain for the GTX 280 and GTX 260 given that their power consumption is lower than that of dual-GPU cards under load and extremely low at idle. Then there’s the lack of support for DirectX 10.1, obviously a political choice, and one that will slow or even prevent adoption by developers, which is hard to defend in light of the Assassin’s Creed affair. The price of the GTX 280 ($650, available starting tomorrow), which puts it in the “ultra-high-end” category, is also problematic in light of the very aggressive price point of the “little” GTX 260: with 18% less performance on average than its big sister, it’s announced at a price that’s almost cut in half, at $400! As a result, availability of the GTX 260, announced for June 26, is likely to be particularly tight, given that it uses the same GT200 GPU.

Finally, we can’t close without mentioning the very concrete prospects for CUDA applications. While a year and a half ago it was unlikely anybody would have pointed to CUDA as one of the positive aspects of the GeForce 8, all that has now changed, with the first three concrete applications ready or nearly ready for use. We’re talking about the BadaBOOM video transcoder and the Folding@Home GeForce beta client, both of which leave the CPU and Radeons in the dust, but also GeForce PhysX support, which has enabled many more developers (true, it wasn’t all that hard) to announce support for the technology in their upcoming games, even if it remains to be seen what difference the implementations will make. All this considerably widens the field of application of CUDA-compatible GeForce GPUs (from the GeForce 8 on), provided optimized software with wider appeal keeps arriving and AMD doesn’t manage to jump onto the stage.

Nvidia GeForce GTX 280
The new ultra-high-end GTX 280 from Nvidia ($650) suffers a little from comparison with the 9800 GX2, which regularly bested it in our tests despite the inherent and irremediable drawbacks of dual-GPU cards. But in reality, the real threat comes from the card’s “little sister,” the GTX 260, especially since the GTX 280’s price will almost buy you two GTX 260s to run in SLI!
  • Pros
    • Improved GeForce 8 architecture
    • Overall performance
    • Very low power consumption at idle
    • Accelerated CUDA software
  • Cons
    • No support for DirectX 10.1
    • Performance compared to the 9800 GX2
    • Noise
    • Price compared to Nvidia competitors

Nvidia GeForce GTX 260
Much more attractive thanks to its aggressive price, the GTX 260 placed consistently high in our gaming tests and offers most of the advantages of the GTX 280 without the high price tag. If it’s really available starting next week at $400, it’s sure to be a best-seller.
  • Pros
    • Improved GeForce 8 architecture
    • Performance only 18% under the GTX 280’s
    • Very low power consumption at idle
    • Accelerated CUDA software
  • Cons
    • No support for DirectX 10.1
    • Noise

We give the GTX 260 our Recommended Buy Award, because it provides the best bang for the buck! Its performance is excellent, while its price is nearly 40% lower than the GTX 280’s.

[Recommended Buy award logo]

Comments from the forums
  • Lunarion
    what a POS, the 9800gx2 is $150+ cheaper and performs just about the same. Let's hope the new ATI cards coming actually make a difference
  • foxhound009
    woow... that's the new "high end" GPU????
    lolz.. the 3870 X2 will get cheaper... and the Nvidia GTX 200 lies on the shelves providing space for dust........
    (I really expected more from this one... :/ )
  • thatguy2001
    Pretty disappointing. And here I was thinking that the gtx 280 was supposed to put the 9800gx2 to shame. Not too good.
  • cappster
    Both cards are priced out of my price range. Mainstream decently priced cards sell better than the extreme high priced cards. I think Nvidia is going to lose this round of "next gen" cards and price to performance ratio to ATI. I am a fan of whichever company will provide a nice performing card at a decent price (sub 300 dollars).
  • njalterio
    Very disappointing, and I had to laugh when they compared the prices for the GTX 260 and the GTX 280, $450 and $600, calling the GTX 260 "nearly half the price" of the GTX 280. Way to fail at math. lol.
  • NarwhaleAu
    It is going to get owned by the 4870x2. In some cases the 3870x2 was quicker - not many, but we are talking 640 shaders total vs. 1600 total for the 4870x2.
  • MooseMuffin
    Loud, power hungry, expensive and not a huge performance improvement. Nice job nvidia.
  • compy386
    This should be great news for AMD. The 4870 is rumored to come in at 40% above the 9800GTX so that would put it at about the 260GTX range. At $300 it would be a much better value. Plus AMD was expecting to price it in the $200s so even if it hits low, AMD can lower the price and make some money.
  • vochtige
    i think i'll get a 8800ultra. i'll be safe for the next 5 generations of nvidia! try harder nv crew
  • cah027
    Looks like ATi might have a fighting chance of catching up to Nvidia. Hopefully this will help AMD out as a company.
  • Anonymous
    I am fairly disappointed. I thought Nvidia would go for the high-end market with great performance and a lot of money, but it's only a lot of money
  • baracubra
    Finally!!! woot
  • sailormonz
    I don't believe quad sli with the 9800GX2 works too well, therefore these cards may be best suited for a person with tons of money to waste and wanting a SLI system with top of the line cards. These results were rather disappointing however.
  • Annisman
    My 9800GX2 looks like it's gonna be staying in my case for at least another month or two. (4870x2 anyone?)
  • RaZZ3R
    what a piece of s***, nVidia, what happened to you? Are you going down the same path as AMD??? And about the review: where is the info about the integrated Ageia PhysX in the GTX 280 and 260... more info, goddammit, and don't tell me about CUDA, because that is software. I want hardware info and capabilities and some screenshots, for god's sake.
  • neodude007
    Boooo I want more power.
  • mr roboto
    I've had my 8800GTX for almost a year and a half and it still owns. My card was $499 when I bought it and was the most expensive card I'll ever buy. However, it's looking more and more like a great investment. This GTX280 is disappointing to say the least. I would love to see ATI jump back in the game!

    What's funny is they might actually compete with this card without even meaning to.
  • zarksentinel
    this card is not worth the money.
  • dragoncyber
    Dear Nvidia,

    I am obviously not the first to state that I am entirely angered by the results of the recent GTX280 benchmarks. I have been an Nvidia customer since the original GeForce series, always trusting the reliable green team to be at the forefront of the graphics race.

    Instead this time I am completely thrown for a loop, as Nvidia expects us to pay $600.00-$650.00 for a (new generation) graphics card release that performs in most cases lower than a 9800GX2, which at the time of this report is actually 150-200 dollars cheaper, and once released will most likely drop even further.

    Not to mention the 9800GTX, which is basically on par with an 8800 Ultra, is down below $300.00!! Tri-SLI anyone?? Three 9800GTXs would outperform "Quad SLI 9800GX2s", and this has already been proven on several sites (AnandTech... Hello???). WTF is wrong with Nvidia?? Tri-SLI has already been posting great numbers!!

    So obviously, knowing this going into a release, you would want to put something on the table that would blow the doors off your current best configurations. Instead we are handed mere 15-20% gains in some situations, and the card is actually beaten in others. And of course they didn't post SLI tests here for this report.

    The next thing that really gets my goat is selling a GTX260 card for 200-250 dollars less, when it has only an 18-20% decrease in performance compared to your shiny new Top-Of-The-Line card. With a little overclocking and some good cooling it's the same card in performance tests. What gives??

    Now the biggest embarrassment for Nvidia is that they are pushing CUDA technology and that Folding@Home client CRAP!! These cards are designed for gaming!! Who gives a crap about how fast they can cure cancer or render 3D medical images and scans of the human body? Maybe 10% of the cards produced will be used for this purpose... the rest will be for GAMING!! What in the hell are we even talking about that junk for in this article?? Does that even matter, when really what everyone cares about is: will it beat the crap out of Crysis!!?? Will it provide me solid gaming for the next 2 years?? Is it worth my hard earned money??

    So far I am in AWE... yes... but not the good AWE, the bad one, the "I can't believe this is happening, here comes the BIG RED DRAGON (ATI) breathing fire and brimstone, I'm a scared little peasant in a village" AWE!!
    I am this close to throwing away my 790i board, putting my SLI'd 8800GTs on eBay, and switching over to a CrossFire platform.

    Again, I cannot state enough how utterly disappointed I am at this turn of events, and the worst part is that there would be no way to do a GTX280 X2 card, because the manufacturing process and die size are too huge and too hot to combine them. They would literally have to redesign the entire thing, basically doing a new release all over again.

    I'm afraid to say it, but I think this will be like one of those boxing matches where the underdog is in the corner getting punched on left and right... Then something clicks, and he decides enough is enough, and he throws a BIG left that hits the other guy right in the jaw. He doesn't see it coming, and as he hits the mat he thinks, "This guy still has some fight in him." Personally I hope ATI comes out on top with the 4870 X2. That will make Nvidia realize they can't sit around in their offices all year playing Nerf hoops, living off the fat of their successes over the last 4 years. They have gotten fat and lazy, and this card shows it all over the place.

    I am, however, going to wait and see if the driver improvements make any difference at all in the upcoming weeks to a couple of months. I have an 8800GT SLI system, and I'm sitting pretty as far as I am concerned. I still consider the 8800GT 512MB to be one of the best graphics cards ever made, and a combo of them is hard to beat. So I will wait and see what happens.

    Best of luck to any who choose to buy it upon release; I think soon you will wish you had waited just a bit longer. The price will drop drastically once all the tech sites get their reviews out.