Mainstream Graphics Card Roundup

Graphics Chips Compared Plus Test Configurations

We ran the following graphics cards on a PC with an Intel Core i7-920 CPU overclocked to 3.8 GHz (its default clock rate is 2.66 GHz), minimizing the possibility of a processor-based bottleneck.

Nvidia Graphics Cards
| Vendor and GPU | Internal Reference | RAM | GPU Clock | Shader | Memory Clock | SPs |
| --- | --- | --- | --- | --- | --- | --- |
| GeForce GTX 295 | 2 x GT200b | 2 x 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 2 x 240 |
| GeForce GTX 285 | GT200b | 1,024 MB GDDR3 | 648 MHz | 4.0, 1,476 MHz | 2 x 1,242 MHz | 240 |
| GeForce GTX 280 | GT200 | 1,024 MB GDDR3 | 602 MHz | 4.0, 1,296 MHz | 2 x 1,107 MHz | 240 |
| GeForce GTX 275 | GT200b | 896 MB GDDR3 | 633 MHz | 4.0, 1,404 MHz | 2 x 1,134 MHz | 240 |
| Zotac GeForce GTX 260² (GTX 260 216 SPs) | GT200b | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 216 |
| GeForce GTX 260 216 SPs | GT200b | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 216 |
| GeForce GTX 260 | GT200 | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 192 |
| Zotac GTS 250 1 GB (GTS 250) | G92b | 1,024 MB GDDR3 | 740 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce GTS 250 | G92b | 1,024 MB GDDR3 | 740 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GX2 | 2 x G92 | 2 x 512 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 1,000 MHz | 2 x 128 |
| GeForce 9800 GTX+ | G92b | 512 MB GDDR3 | 738 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GTX | G92 | 512 MB GDDR3 | 675 MHz | 4.0, 1,688 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9600 GT | G94 | 1,024 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 9600 GT | G94 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 8800 GTS 512 | G92 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 972 MHz | 128 |
| GeForce 8800 GT | G92 | 1,024 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 GT | G92 | 512 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 Ultra | G80 | 768 MB GDDR3 | 612 MHz | 4.0, 1,512 MHz | 2 x 1,080 MHz | 128 |
| GeForce 8800 GTX | G80 | 768 MB GDDR3 | 576 MHz | 4.0, 1,350 MHz | 2 x 900 MHz | 128 |
| GeForce 8800 GTS | G80 | 640 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |
| GeForce 8800 GTS | G80 | 320 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |
ATI Graphics Cards
| Vendor and GPU | Codename | RAM | GPU Clock | Shader | Memory Clock | SPs |
| --- | --- | --- | --- | --- | --- | --- |
| Radeon HD 4890 | RV790 | 1,024 MB GDDR5 | 850 MHz | 4.1 | 4 x 975 MHz | 800 |
| Radeon HD 4870 X2 | R700 (2 x RV770) | 2 x 1,024 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 2 x 800 |
| HIS H487QT1GP ICEQ4+ (HD 4870) | RV770 | 1,024 MB GDDR5 | 770 MHz | 4.1 | 4 x 1,000 MHz | 800 |
| Sapphire Vapor-X HD4870 2G (HD 4870) | RV770 | 2,048 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 800 |
| Radeon HD 4870 | RV770 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 800 |
| Sapphire HD4850 1G (HD 4850) | RV770 | 1,024 MB GDDR3 | 625 MHz | 4.1 | 2 x 993 MHz | 800 |
| Radeon HD 4850 | RV770 | 512 MB GDDR3 | 625 MHz | 4.1 | 2 x 993 MHz | 800 |
| Radeon HD 4770 | RV740 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 800 MHz | 640 |
| Radeon HD 4670 | RV730 | 512 MB GDDR3 | 750 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| Radeon HD 3870 X2 | R680 | 2 x 512 MB GDDR3 | 823 MHz | 4.1 | 2 x 900 MHz | 2 x 320 |
| Radeon HD 3870 | RV670 | 512 MB GDDR4 | 776 MHz | 4.1 | 2 x 1,125 MHz | 320 |
| Radeon HD 3850 | RV670 | 256 MB GDDR3 | 668 MHz | 4.1 | 2 x 829 MHz | 320 |

SPs = stream processors; R680 = 2 x RV670; R700 = 2 x RV770; Shader 2.0 = DirectX 9.0; Shader 3.0 = DirectX 9.0c; Shader 4.0 = DirectX 10; Shader 4.1 = DirectX 10.1
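The doubled ("2 x") and quadrupled ("4 x") memory clocks in the tables above reflect effective data rates: GDDR3/GDDR4 transfer twice per base clock, and GDDR5 four times. As a rough sketch of how these figures translate into peak memory bandwidth (the bus widths used below are not listed in the tables and are assumptions, not article data):

```python
# Peak bandwidth (GB/s) = effective clock (MHz) x bus width (bits) / 8 / 1000.
def bandwidth_gbps(base_clock_mhz, multiplier, bus_width_bits):
    """multiplier: 2 for GDDR3/GDDR4 (double data rate), 4 for GDDR5."""
    return base_clock_mhz * multiplier * bus_width_bits / 8 / 1000

# The 512-bit and 256-bit bus widths are assumptions for illustration.
print(bandwidth_gbps(1107, 2, 512))  # GeForce GTX 280
print(bandwidth_gbps(900, 4, 256))   # Radeon HD 4870
```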


Test System For Nvidia and ATI Graphics Cards
| Component | Details |
| --- | --- |
| Motherboard | Asus P6T, PCIe 2.0, ICH10R, 3-Way SLI |
| Chipset | Intel X58 |
| Memory | Corsair TR3X6G1600C8D, 3 x 2 GB DDR3, 2 x 570 MHz, 8-8-8-20 |
| Audio | Realtek ALC1200 |
| LAN | Realtek RTL8111C |
| HDDs | SATA: Western Digital Raptor WD300HLFS, WD5000AAKS |
| DVD | Gigabyte GO-D1600C |
| Power Supply | CoolerMaster RS-850-EMBA, 850 W |

Drivers and Configuration

| Component | Details |
| --- | --- |
| Graphics | ATI Catalyst 9.5, Nvidia GeForce 185.85 WHQL |
| OS | Windows Vista Ultimate 32-bit, SP1 |
| DirectX | 9, 10, and 10.1 |
| Chipset Driver | Intel |
  • ColMirage
    Great article! Good to see a large variety of old and new.
  • Why do you keep including The Last Remnant test when it's obvious that there is a problem with the ATI cards? It biases the overall results, which is unfair to ATI and to the folks who jump directly to the conclusion.

    Also, when you say *quote* "DirectX 10 crashed at 8x AA and the game and screen went black. Switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher" *unquote* for HAWX, didn't you mean "ATI cards were a lot faster than Nvidia ones under DirectX 10 thanks to DirectX 10.1, and that was unacceptable. Hence the switch to DirectX 9 instead, where the game works at 8x AA and offers frame rates up to 50% higher for Nvidia, and ATI is fcked again, close one guys"?

    I am not an ATI fanboy, but I think TH has got its tongue stuck up a juicy green @ss.
  • cinergy
    Tino is putting up a big GeForce ad again. No mention of the recent HUGE Radeon price cuts (e.g. the Radeon HD 4890 goes for $199.99 with a $10 mail-in rebate on Newegg, and the 4850 should go for $99). And HAWX is again benchmarked without the DX10.1 setting because of such crappy results for ATI. And not even a mention that such technology exists in the game!
  • NuclearShadow
    I think it's highly unfair that you would put The Last Remnant in as a benchmark. The game simply hates ATI cards, and if you included it when drawing your conclusion, then I think you're intentionally being biased.

    Also, I'm not sure why you're holding up the 260 as the best choice. The Zotac one you even picture is priced at $175 at Newegg, while the HIS 4850 1GB is around $115 at Newegg. Sure, the 260 outperforms it, but when you take that price difference and look at the performance, the 4850 1GB is certainly attractive. The lowest price for a 250 1GB on Newegg is $140, and Zotac's costs $154.99, and if you compared the 4850 1GB to it using your own charts, you would see that the big hit to the sum-of-fps figures is largely the effect of The Last Remnant.

    Speaking of Zotac, I noticed that for some reason whenever they are mentioned they get a major ass-kissing. While they make good products, it's clear that there is a bias here. You even picture the Zotac 260 and even gave it the ability to be selected on your own little comparison charts, and look what you get when you compare it to a normal 260 216 SPs,1173.html?prod=on&prod=on The exact same results, and for some reason you deemed it necessary to list it individually as if it were special.

    Next time, how about giving a real conclusion instead of an advertisement. Comparing a $115 card to a $175 one and pushing Zotac down our throats makes it damn obvious what you're doing.
  • scrumworks
    Wow! Tom's just can't let Nvidia go. ATI clearly has the price-performance advantage now. No Last Remnant benchmarks are going to change that.
  • Summer Leigh Castle
    I'm not an expert, but the article felt like Tom's was trying too hard... just a "little" bias here.
  • d0gr0ck
    That's o
  • da bahstid
    Tino must have missed that whole thing a few months back where educated readers decided they weren't going to tolerate such ridiculously biased conclusions. There's only a $5 difference in price between the HIS 4870 and the Zotac 260 on Newegg as I write this (nothing like Tino's claim that the Zotac has a >$35 advantage), and the performance of the two came within 0.5% of each other... DESPITE two extremely pro-Nvidia slanted tests (Last Remnant and HAWX).

    I actually encourage keeping the Last Remnant test, because ATI shouldn't get breaks for poor drivers (or inadequate collaboration with developers), but by the same token, if Nvidia loses out for lack of 10.1 support, that result absolutely needs to be included. TH was actually starting to look credible again, and it must have taken you guys months of seriously attentive work and comprehensive benchmarking to regain that... what in the world are you guys thinking, starting up this tripe again?
  • Ramar
    I agree, The Last Remnant is stupid. Everyone with an ATI card knows they probably can't play it. Funny, considering it was developed for a console with an ATI chip.

    Let's do some simple math here to see if ATI really has the price-to-performance advantage.

    I'll use Far Cry 2, because I think it's a very fair measure of DirectX 10 power, "TWIMTBP" be damned.

    Top range, GTX 295 vs. 4870 X2: there's a performance difference on par with their respective prices, especially at the highest resolution and AA/AF setting.

    Upper mid-range, 4890 vs. GTX 275: again, the performance percentage is very close to the twenty-dollar difference between the cards, and exceeded on Nvidia's side at the highest resolution.

    High mid-range, 4870 vs. GTX 260 216: even in Left 4 Dead, a Source-engine game favoring ATI, the 260 comes out on par. This is a tie, really. But don't kid yourself into thinking the 4850 is any kind of match for the 260.

    Mid-range, 4850 vs... well, if you take a GTS 250, they're very evenly matched. If you REALLY want a 9800 GT for the same price, well, sucks to have an IQ of 50.

    Also, factor in the growing use of PhysX, and ATI doesn't make a very compelling argument. Prices are matched frustratingly well, and the only real "killer deal" is a 4850 X2 for slightly over $200.

    Know that I like ATI and I'm not saying they're bad cards; I'm just saying they're only on par with Nvidia's offerings, not above them.

    Just remember that DirectX 11 will be in full swing in around six months, and none of this will matter anyway.
  • d0gr0ck
    I don't know what you did to get those TLR benches on the ATI cards. On a single HD 4870 (512 MB, reference style) card, I easily got a playable 60 fps at 1680x1050 at high settings plus medium shadows. Once I upgraded to CrossFire, the frame rate blows past 60 fps on all high settings. Slowdown only occurs when the game churns out an inordinate amount of lighting/shadow effects. I use an old X38/E8500 combo to game on, so by all means you should be getting better results than I do.
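The price-to-performance arguments running through the comments above boil down to a frames-per-second-per-dollar calculation. As a minimal sketch: only the $175 and $115 prices come from the thread; the frame-rate figures below are made-up placeholders, not benchmark results.

```python
# fps per dollar as a crude price-performance metric.
def fps_per_dollar(price_usd, avg_fps):
    return avg_fps / price_usd

# Prices from the comments above; the fps figures are illustrative only.
cards = {
    "GTX 260 216 ($175)": fps_per_dollar(175, 60),
    "HD 4850 1GB ($115)": fps_per_dollar(115, 48),
}
for name, ratio in cards.items():
    print(f"{name}: {ratio:.3f} fps/$")
```

A cheaper card can lose every head-to-head benchmark and still win on this metric, which is exactly the trade-off the commenters are arguing about.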