
GeForce GTX 260 OC

The Fastest 3D Cards Go Head-To-Head

The 75 mm fan generates 54 dB(A) under full load.

The GeForce GTX 260 is fitted with 896 MB of GDDR3 memory (on a 448-bit bus) and supports DirectX 10. The default clock rates are 576 MHz for the GPU, 1,242 MHz for the shaders, and 1,998 MHz for the memory. On our sample, MSI overclocked those frequencies to 620, 1,296 and 2,160 MHz, respectively. The best gain is seen in Mass Effect (UT3 Engine) at 1920x1200 pixels with anti-aliasing enabled—the overclocked values yield a frame rate increase of 14.2%. If you take the average of all the games included in the benchmark suite, the gain is 4.5%, which halves the gap between GeForce GTX 260 and a normally-clocked GTX 280.
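A quick back-of-the-envelope check of MSI's factory overclock, using only the clock figures quoted above (a minimal Python sketch, not part of our test methodology):

```python
# Stock vs. MSI-overclocked clocks in MHz, as quoted in the review.
clocks = {
    "GPU":     (576, 620),
    "shaders": (1242, 1296),
    "memory":  (1998, 2160),
}

for domain, (stock, oc) in clocks.items():
    gain = (oc / stock - 1) * 100
    print(f"{domain}: {stock} -> {oc} MHz (+{gain:.1f}%)")
# GPU +7.6%, shaders +4.3%, memory +8.1%
```

The clock gains of roughly 4 to 8% outstrip the 4.5% average frame-rate gain, a hint that most titles in the suite are at least partly limited by something other than the card's clocks.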

In 3D performance, the GTX 260 puts up a tough fight against AMD’s Radeon HD 4870. At 1280x1024, the GTX 260 is actually better, while at 1680x1050 (without anti-aliasing) the HD 4870 wins by 1.4%. With anti-aliasing enabled, the GTX 260 is 10% faster. At 1920x1200 without AA, the HD 4870 wins by a couple of frames per second; here the Radeon’s fast GDDR5 memory starts to make itself felt. With anti-aliasing turned on, though, the GTX 260 is 6% faster. The considerable price drop to $290 makes the GTX 260 a good alternative to the slightly weaker and now comparably priced AMD Radeon HD 4870.

The noise the GTX 260 generates while sitting on the Windows desktop is reasonable, in the neighborhood of 38.1 dB(A). After our 3D tests, however, the fan couldn’t make up its mind: the temperature dropped to 45 degrees C, but the fan speed didn’t change, so even with the card running without load we could still hear up to 44.2 dB(A). The speed readjustment only occurred in the test with the X38-based motherboard; it did not manifest itself on the 780i-based board.

The power consumption of the GTX 260 in 2D mode is considerably lower than either of AMD’s offerings. As soon as the GTX 260 comes out of 3D mode, it switches to its low-power 3D profile (GPU at 400 MHz, shaders at 800 MHz, and memory at 600 MHz), where the entire system draws 125 watts. After a few more seconds of idle, the clocks switch to 2D mode (GPU at 300 MHz, shaders at 600 MHz, and memory at 200 MHz), and overall consumption falls to 111 watts. Under full 3D load, the GeForce GTX 260 consumes 336 watts. A solid power supply with 280 to 320 watts of overall power and 23 to 27 A on the 12-volt rail should be sufficient here.
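The three power states described above can be summarized in a small sketch; the figures are the system-level measurements from this review, and the profile names are ours, not NVIDIA's official labels:

```python
# GTX 260 clock profiles (MHz) and total system draw (W), as measured above.
# Profile names are informal labels, not official NVIDIA terminology.
profiles = {
    "2D":           {"gpu": 300, "shaders": 600,  "memory": 200,  "system_w": 111},
    "low-power 3D": {"gpu": 400, "shaders": 800,  "memory": 600,  "system_w": 125},
    "3D full load": {"gpu": 576, "shaders": 1242, "memory": 1998, "system_w": 336},
}

# Stepping from the low-power 3D profile down to 2D saves 14 W at the wall.
idle_saving = profiles["low-power 3D"]["system_w"] - profiles["2D"]["system_w"]
# Full 3D load adds 225 W over the 2D idle baseline.
load_delta = profiles["3D full load"]["system_w"] - profiles["2D"]["system_w"]
print(idle_saving, load_delta)  # 14 225
```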

Our GTX 260 sample is the OC model from MSI.
MSI's bundle includes the Colin McRae Dirt racing game.

There is an internal SPDIF connection for sound via the HDMI adapter.
Two PCIe connections, each with six pins, handle power delivery.

The SLI connections are hidden under a cover.
Three graphics cards can be joined using two SLI connections.

The fan is two slots high, and exhaust air is expelled from the PC case.
The card is almost 11” (27 cm) in length; the two power connections are at the sides.

The I/O panel has one video and two DVI outputs.
VGA and HDMI adapters are supplied by MSI.

Comments
This thread is closed for comments
  • -4 Hide
    San Pedro , August 29, 2008 10:14 AM
    Looks like the results for SLI and Crossfire were switched with the single card results. . .
  • 14 Hide
    Duncan NZ , August 29, 2008 10:40 AM
    Not a bad article, really comprehensive.
    My one complaint? Why use that CPU when you know that the test cards are going to max it out? Why not a quad core OC'ed to 4 GHz? It'd give far more meaning to the SLI results. We don't want results that we can duplicate at home; we want results that show what these cards can do. It's a GPU card comparison, not a complaint about not having a powerful enough CPU story.

    Oh, and please get a native English speaker to give it the once-over for spelling and grammar errors, although this one had far fewer than many articles posted lately.
  • 14 Hide
    elbert , August 29, 2008 10:50 AM
    No 4870X2 in CF, so it's the world's top-end Nvidia vs. ATI's mid-to-low end.
  • 15 Hide
    Lightnix , August 29, 2008 10:51 AM
    It'd be a good article if you'd used a powerful enough CPU and up-to-date Radeon drivers (considering we're now up to 8.8), I mean are those even the 'hotfix' 8.6's or just the vanilla drivers?
  • 20 Hide
    elbert , August 29, 2008 10:55 AM
    Version AMD Catalyst 8.6? Why not just say i'm using ATI drivers with little to no optimizations for the 4800's. This is why the CF benchmarks tanked.
  • 9 Hide
    Anonymous , August 29, 2008 10:57 AM
    At 1280, all of the high-end cards were CPU limited. At that resolution, you need a 3.2-3.4 C2D to feed a 3870... this article had so much potential, and yet... so much work, so much testing, all for nothing, because most of the results are very CPU limited (except 1920 with AA).
  • 19 Hide
    wahdangun , August 29, 2008 11:07 AM
    WTF, hd4850 SHOULD be a lot faster than 9600 GT and 8800 GT even tough they have 1Gig of ram
  • 16 Hide
    mjam , August 29, 2008 11:09 AM
    No 4870X2 and 1920 X 1200 max resolution tested. How about finishing the good start of an article with the rest of it...
  • 15 Hide
    Anonymous , August 29, 2008 11:50 AM
    I agree, the 4870 X2 should have been in there and should have used the updated drivers. Good article but I think you fell short on finishing it.
  • -8 Hide
    Anonymous , August 29, 2008 11:59 AM
    @pulasky - Rage much? It's called driver issues, you dumbass. Some games are more optimised for multi-card setups than others, and even then some favour SLI over Crossfire. And if you actually READ the article rather than let your shrunken libido get the better of you, you'll find that Crossfire does indeed work in CoD4.

    Remember, the more you know.
  • 7 Hide
    buzzlightbeer , August 29, 2008 12:03 PM
    Isn't ForceWare 177.41 out for the GT200 series? So they are using a recent driver for the Nvidia cards yet not for the ATI ones. Yes, I would have to agree with wahdangun: the 4850 is a lot faster than the 9600 GT and the 8800 GT. I have two friends with both cards and Q6600s, one at 3.2 (9600 GT) and the other at 3.0 (4850), and the 4850 machine destroys the other one even with a lower-clocked CPU.
    But yes, the article was off to a great start. Maybe throw some Vantage in there as well?
  • 15 Hide
    chesterman , August 29, 2008 12:06 PM
    Agree with the others. You guys should use a more recent driver for ATI/AMD cards, use a more game-effective CPU, and REALLY should have put the 4870X2 in the fight.
  • 11 Hide
    masterwhitman , August 29, 2008 12:09 PM
    elbert: "Version AMD Catalyst 8.6? Why not just say i'm using ATI drivers with little to no optimizations for the 4800's. This is why the CF benchmarks tanked."

    Precisely; several other websites tested with 8.7 and 8.8 long before this article was published. Why couldn't you? Look at the 8.6 release notes; it doesn't even mention the HD4000 series cards as supported devices.

    Brilliant guys.
  • 0 Hide
    roynaldi , August 29, 2008 12:27 PM
    NVISION comes around and IRONicallY, a 36 page article is produced that is magically in favor of, whats that, NVIDIA!!!

    After having the Mythbusters appear, you would think this would be the most comprehensive, "scientific," factual, and up-to-date article meeting Tom's usual standards.... I didn't finish reading this.
  • 10 Hide
    xrodney , August 29, 2008 12:47 PM
    Using old drivers with no optimisation at all for the newest cards, which were released months ago, seems too strange to me. Also, the temperature results for the 48xx are quite the opposite of reality, at least when compared to Catalyst 8.8.
    (82 degrees in 2D, 69 in 3D with no fan fix)
  • 2 Hide
    jitpublisher , August 29, 2008 1:00 PM
    Pretty good, finally. Wish you would have used an overclocked quad so the newer GPUs could show their full potential, and you really should have used the latest drivers, but I give this article two thumbs up. Lots of good information in here.
  • 3 Hide
    Haiku214 , August 29, 2008 1:13 PM
    Well, the main reason why they don't have the 4870X2 and the latest drivers is simply that they made this article a couple of weeks ago. If you could just imagine how long and tedious it is to produce all these data and results. It's just sad that after finally finishing the article, a lot of new stuff has already happened (new drivers and the X2).
  • 6 Hide
    jameskangster , August 29, 2008 1:19 PM
    First I want to say that the article itself is not bad at all.
    Also, I can understand why TH didn't have time to use 8.8 since it was released publicly on August 20, 2008 (Although ATI would have gladly released a beta version to TH for testing purposes).

    However, AMD publicly released stable Catalyst 8.7 (internal version 8.512) on July 21, 2008. That's more than a month ago. It has numerous improvements (for example, CF performance increases, and improved stability and performance under Vista). To be honest, most of the improvements range from 4% to 15% (in the CF case, up to 1.7x scaling).

    TH has rarely been unfair and/or inaccurate and they always owned up to their mistakes before, and I trust them to re-test ATI products with at least 8.7 if not 8.8 to continue to uphold their values and integrity.
  • 7 Hide
    outlw6669 , August 29, 2008 1:20 PM
    So, to start off with, this article is much better than many of the other recent reviews. I feel you put some thought into it and for the most part it is good. I found the comparative performance charts at the end interesting. Have you thought of changing the GPU charts in a similar fashion?

    Now on to my criticism.

    I can understand how you want to keep the results homogeneous with previous results but if you already know that a stock QX6800 will bottleneck the system, be proactive in fixing it. At the very least you should have done a small segment of the review showing the newer cards with a quad core overclocked to 4.0 GHz.

    Also, if you have ever read any of the older Tom's articles, you would know that you can still minimise the bottleneck from a slow CPU by raising the resolution. Perhaps you should test the fastest cards at the highest resolutions?

    I can also understand why you did not use the latest nVidia drivers. It takes time to create a review of this scale and the GF8/9 series drivers have been stable for some time. As the GT200 series brings no new features to the table, it would have needed little optimisation for the newer cards, allowing the slightly dated drivers to perform nicely.

    What I can not understand is why you would use ATI's 8.6 drivers??
    The 8.7 drivers have been out for more than a month bringing quite a few fixes/optimisations with it. I understand it probably took more than 9 days to complete all of these benchmarks (today is the 29th, the 8.8 drivers were officially released on the 20th) but you should have called ATI and asked for their latest drivers. The 8.8 drivers were leaked at least a week before the official release which means, if you could nurture a relationship with the people you review, they could/probably would have provided them to you. There is still no excuse I can see for testing with the old 8.6 drivers. Seriously, it does not even have official support for the 48X0 cards...

    From the title of the article,"The Fastest 3D Cards Go Head-To-Head", I would have assumed that you would have been testing the Fastest 3D cards? What happened to your 4870x2? As you have already attempted to review it, we know you have your hands on one. How can you claim to review the "Fastest 3D Cards" and still leave out the fastest card?

    In summation, I liked many things from this article. The layout was nice and a little more technical than we have been seeing as of late. I enjoyed the comparison charts at the end and I think you should adopt a similar method for the CPU and GPU charts. I would have thought this was an excellent and well thought out article if it had not been for the glaring and obvious deficiencies in reason. I give you credit for stepping Toms in the right direction. With a little more unbiased comparison, critical thinking and common sense I could come to see reviews such as this in a very positive light.