The Fastest 3D Cards Go Head-To-Head

GeForce GTX 280 Superclocked

The 75 mm fan is very audible at 54.7 dB(A) under full load.

The GTX 280 is fitted with 1,024 MB of GDDR3 RAM (on a 512-bit bus) and supports DirectX 10. Nvidia's reference design runs at 602 MHz for the GPU, 1,296 MHz for the shaders and 2,214 MHz for the memory. MSI offers the card in two overclocked versions: a standard overclocked model at 650, 1,296 and 2,300 MHz respectively, and the superclocked version we're testing at 700, 1,400 and 2,300 MHz. The overclocking improves frame rates in Mass Effect (UT3 engine) at 1920x1200 with anti-aliasing by 16%. Averaged across all games in the benchmark suite, the gain is 5.8%, the best result among the overclocked MSI models we tested.
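The clock gains can be put into perspective with a quick calculation. The clock speeds below come straight from the review; only the percentage arithmetic is ours:

```python
# Clock speeds (MHz) quoted in the review: Nvidia reference vs. MSI superclocked.
reference = {"gpu": 602, "shader": 1296, "memory": 2214}
superclocked = {"gpu": 700, "shader": 1400, "memory": 2300}

# Percentage increase per clock domain.
for domain in reference:
    gain = (superclocked[domain] / reference[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
```

The GPU clock rises by about 16%, the shaders by 8% and the memory by 4%, which suggests the 16% frame-rate gain in Mass Effect tracks the GPU clock almost one to one, while the 5.8% average reflects games that are less purely GPU-clock bound.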

In terms of overall performance, the GeForce GTX 280 is the fastest card in the test, convincingly distancing itself from AMD's competition; it took first place in five of the six test resolutions. Between the GeForce GTX 260 and GTX 280, however, overall performance differs by only 8.7%, which hardly justifies paying almost $150 more. Pressure from the competing Radeon HD 4870 has already pushed the GTX 280's launch price down from $649 to $420.
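A back-of-the-envelope value check makes the price argument concrete. Note that the $270 GTX 260 price below is inferred from "almost $150 more" and is our assumption, not a figure quoted in the review:

```python
# Price/performance sketch. GTX 260 price is an assumption (roughly $150
# below the GTX 280's $420), not a value stated in the review.
gtx280_price = 420    # street price after the cut, per the review
gtx260_price = 270    # assumed: about $150 less
perf_advantage = 8.7  # GTX 280's overall lead in percent, per the review

price_premium = (gtx280_price / gtx260_price - 1) * 100
print(f"{price_premium:.0f}% more money for {perf_advantage}% more speed")
```

Under these assumptions the buyer pays roughly half again as much for a single-digit performance gain, which is why the GTX 260 is the better value pick.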

Although the GTX 280 is the same size as the GTX 260 at 11” (27 cm) and runs at higher clock rates, it isn't much louder. In 2D mode, its temperature rises to 53 degrees Celsius (versus 49 degrees for the GTX 260), but the fan generates only 37.7 dB(A), while the GTX 260 comes in at 38.1 dB(A). We saw no problems with overly aggressive fan speeds on the desktop: as long as the GPU stays adequately cooled, the fan remains quiet. In 3D mode, the GTX 280 screams at 54.7 dB(A), louder than the GTX 260, but it tops out at 85 degrees Celsius (the GTX 260 reaches 105 degrees).

The GTX 280 clocks down in 2D mode, which makes it even more economical than AMD's HD 4850. As soon as the GTX 280 leaves 3D mode, it switches to a low-power 3D profile (GPU at 400 MHz, shaders at 800 MHz, memory at 600 MHz), at which the entire system draws 130 watts. After a few seconds at idle, the card drops into 2D mode (GPU at 300 MHz, shaders at 600 MHz, memory at 200 MHz) and overall consumption falls to 117 watts. Under full load, the system with the GeForce GTX 280 consumed 352 watts. A branded power supply rated at 290 to 330 watts with 24 to 28 A on the 12-volt rail should be sufficient for a standard system.
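The jump from a 352-watt wall reading to a 290-to-330-watt PSU recommendation can be sketched as follows. The 80% efficiency and the 90%-on-12-V split are our assumptions, typical of 2008-era power supplies, not values from the review:

```python
# Rough PSU sizing from the measured wall draw. Efficiency and 12 V share
# are assumed typical values, not figures from the review.
wall_draw_w = 352      # full-load draw of the whole system, at the wall
psu_efficiency = 0.80  # assumed AC-to-DC conversion efficiency
share_on_12v = 0.90    # assumed fraction of the DC load on the 12 V rail

dc_load_w = wall_draw_w * psu_efficiency  # power actually delivered to components
amps_12v = dc_load_w * share_on_12v / 12  # resulting current on the 12 V rail
print(f"{dc_load_w:.0f} W DC, {amps_12v:.0f} A on 12 V")
```

That works out to roughly 282 W of DC load and about 21 A on the 12 V rail; the review's 290 to 330 W / 24 to 28 A recommendation simply adds a safety margin on top.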

The GTX 280 card is the superclocked model from MSI. MSI's bundle includes the Colin McRae Dirt racing game.

An internal SPDIF connection transfers sound to the HDMI adapter. Power delivery is handled by two PCIe connectors, one with six pins and one with eight pins.

The SLI connectors are hidden under a cover. Three graphics cards can be joined using the two SLI connectors.

The cooler is two slots high, and exhaust air is expelled from the PC case. The card is almost 11” (27 cm) in length, and the two power connections are at the sides.

The GTX 280's circuitry is hidden under a cover that spans the whole card. The fan is located slightly off to the side, pushing warm exhaust air out of the case.

The I/O panel has one video and two DVI outputs. The VGA and HDMI adapters are supplied by MSI.

Comments from the forums
    Top Comments
  • elbert
    Version AMD Catalyst 8.6? Why not just say i'm using ATI drivers with little to no optimizations for the 4800's. This is why the CF benchmarks tanked.
  • wahdangun
    WTF, hd4850 SHOULD be a lot faster than 9600 GT and 8800 GT even tough they have 1Gig of ram
  • mjam
    No 4870X2 and 1920 X 1200 max resolution tested. How about finishing the good start of an article with the rest of it...
  • Other Comments
  • San Pedro
    Looks like the results for SLI and Crossfire were switched with the single card results. . .
  • Duncan NZ
    Not a bad article, really comprehensive.
    My one complaint? Why use that CPU when you know that the test cards are going to max it out? Why not a quad core OC'ed to 4GHz? It'd give far more meaning to the SLI results. We don't want results that we can duplicate at home, we want results that show what these cards can do. Its a GPU card comparason, not a complain about not having a powerful enough CPU story.

    Oh? And please get a native english speaker to give it the once over for spelling and grammar errors, although this one had far less then many articles posted lately.
  • elbert
    No 4870x2 in CF so its the worlds top end Nvidia vs ATI mid to low end.
  • Lightnix
    It'd be a good article if you'd used a powerful enough CPU and up to date Radeon drivers (considering we're now up to 8.8 now), I mean are those even the 'hotfix' 8.6's or just the vanilla drivers?
  • Anonymous
    at 1280, all of the highend cards were CPU limited. at that resolution, you need a 3.2-3.4 c2d to feed a 3870... this article had so much potential, and yet... so much work, so much testing, fast for nothing, because most of the results are very cpu limited (except 1920@AA).
  • Anonymous
    I agree, the 4870 X2 should have been in there and should have used the updated drivers. Good article but I think you fell short on finishing it.
  • Anonymous
    @pulasky - Rage much? It's called driver issues you dumbass. Some games are more optimised for multicard setups than others, and even then some favour SLi to Crossfire. And if you actually READ the article rather than let your shrinken libido get the better of you, you'll find that Crossfire does indeed work in CoD4.

    Remember, the more you know.
  • buzzlightbeer
    isnt forceware 177.41 out for gt200 series? so they are using a recent driver for the nvidia cards yet not for the ATI yes would have to agree with wahdangun the 4850 is alot faster then the 9600gt and the 8800gt i have 2 friends with both cards with q6600s one at 3.2 (9600gt) and the other at 3.0 (4850) and the 4850 machine destroys the other one even with a lower clocked cpu
    but yes the article was off to a great start, maybe throw some vantage in there as well?
  • chesterman
    agree with the others. u guys should use a more recent driver for ati/amd cards, use a more game-effective cpu and REALLY should have put the 4870x2 on the fight
  • masterwhitman
    elbert: "Version AMD Catalyst 8.6? Why not just say i'm using ATI drivers with little to no optimizations for the 4800's. This is why the CF benchmarks tanked."

    Precisely; several other websites tested with 8.7 and 8.8 long before this article was published. Why couldn't you? Look at the 8.6 release notes; it doesn't even mention the HD4000 series cards as supported devices.

    Brilliant guys.
  • Anonymous
    and why use vista when noone that considers itself a gamer(even casual) touches with a ten-foot pole.
    This is another reason why the results are tanked, in XP you get 15% more performance compared to these values
  • roynaldi
    NVISION comes around and IRONicallY, a 36 page article is produced that is magically in favor of, whats that, NVIDIA!!!

    After having the Mythbusters appear, you would think this would be the most comprehensive, "scientific," factual, and update article meeting Tom's usual standards.... I didn't finish reading this.
  • xrodney
    Using old drivers with no optimalisation at all fo newest card whitch was released months ago seems too strange to me. Also temperature results for 48xx are quite oposite reality, at least when compare to 8.8 catalyst.
    (82 temperature in 2D 69 in 3D with no fanfix)
  • jitpublisher
    Pretty good, finally. Wish you would have have used an overclocked Quad so the newer GPU's could show their full potentianl, and you really should have used the latest drivers, but I give this article 2 thumbs up. Lot of good information in here.
  • Haiku214
    Well the main reason why they don't have the 4870x2 and the latest drivers is simply because they made this article a couple of weeks ago. If you could just imagine how long and tedious it is to produce all these data and results. It's just sad that after finally finishing the article, a lot of new stuff has already happened(new drivers and the x2).
  • jameskangster
    First I want to say that the article itself is not bad at all.
    Also, I can understand why TH didn't have time to use 8.8 since it was released publicly on August 20, 2008 (Although ATI would have gladly released a beta version to TH for testing purposes).

    However, AMD publicly released stable Catalyst 8.7(internal version 8.512) on July 21, 2008. That's more than a month ago. It has numerous improvements (for example, CF performance increase, improved stability and performance under Vista). To be honest, most of the improvements range from 4% to 15%. (In CF case, up to 1.7 X scaling)

    TH has rarely been unfair and/or inaccurate and they always owned up to their mistakes before, and I trust them to re-test ATI products with at least 8.7 if not 8.8 to continue to uphold their values and integrity.
  • outlw6669
    So, to start off with, this article is much better than many of the other recent reviews. I feel you put some thought into it and for the most part it is good. I found the comparative performance charts at the end interesting. Have you thought of changing the GPU charts in a similar fashion?

    Now on to my criticism.

    I can understand how you want to keep the results homogeneous with previous results but if you already know that a stock QX6800 will bottleneck the system, be proactive in fixing it. At the very least you should have done a small segment of the review showing the newer cards with a quad core overclocked to 4.0Ghz.

    Also, if you have ever read any of the older Toms articles, you would know that you can still minimise the bottleneck from a slow GPU bye raising the resolution. Perhaps you should test the fastest cards at the highest resolutions?

    I can also understand why you did not use the latest nVidia drivers. It takes time to create a review of this scale and the GF8/9 series drivers have been stable for some time. As the GT 200 series brings no new features to the table, they would needed little optimisation for their newer cards allowing the slightly dated drivers to perform nicely.

    What I can not understand is why you would use ATI's 8.6 drivers??
    The 8.7 drivers have been out for more than a month bringing quite a few fixes/optimisations with it. I understand it probably took more than 9 days to complete all of these benchmarks (today is the 29th, the 8.8 drivers were officially released on the 20th) but you should have called ATI and asked for their latest drivers. The 8.8 drivers were leaked at least a week before the official release which means, if you could nurture a relationship with the people you review, they could/probably would have provided them to you. There is still no excuse I can see for testing with the old 8.6 drivers. Seriously, it does not even have official support for the 48X0 cards...

    From the title of the article,"The Fastest 3D Cards Go Head-To-Head", I would have assumed that you would have been testing the Fastest 3D cards? What happened to your 4870x2? As you have already attempted to review it, we know you have your hands on one. How can you claim to review the "Fastest 3D Cards" and still leave out the fastest card?

    In summation, I liked many things from this article. The layout was nice and a little more technical than we have been seeing as of late. I enjoyed the comparison charts at the end and I think you should adopt a similar method for the CPU and GPU charts. I would have thought this was an excellent and well thought out article if it had not been for the glaring and obvious deficiencies in reason. I give you credit for stepping Toms in the right direction. With a little more unbiased comparison, critical thinking and common sense I could come to see reviews such as this in a very positive light.