
Graphics Chips Compared Plus Test Configurations

Mainstream Graphics Card Roundup

We ran the following graphics cards on a PC with an Intel Core i7-920 CPU overclocked to 3.8 GHz (up from its stock 2.66 GHz, roughly a 43% increase) to minimize the possibility of a processor-based bottleneck.
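Whether such an overclock is doing its job can be sanity-checked in the results themselves: a common heuristic is that if frame rates barely drop as resolution rises, the benchmark is CPU-bound rather than GPU-bound. The sketch below illustrates that check; it is a minimal example with hypothetical frame rates, not measurements from this roundup.

```python
# Heuristic check for a CPU bottleneck: if average FPS barely drops as the
# resolution (and therefore GPU load) rises, the CPU is the limiting factor.
# The frame rates below are hypothetical examples, not measured results.

def is_cpu_bound(fps_by_resolution, tolerance=0.10):
    """Return True if FPS falls by less than `tolerance` (as a fraction)
    from the lowest to the highest resolution tested."""
    rates = list(fps_by_resolution.values())
    low_res_fps, high_res_fps = rates[0], rates[-1]
    drop = (low_res_fps - high_res_fps) / low_res_fps
    return drop < tolerance

# Hypothetical run: FPS stays nearly flat with rising resolution -> CPU-bound.
sample = {"1280x1024": 92.0, "1680x1050": 90.5, "1920x1200": 88.0}
print(is_cpu_bound(sample))  # True: adding GPU load hardly moves the FPS
```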

Nvidia Graphics Cards

| Vendor and GPU | Internal Reference | RAM | GPU Clock | Shader (Model, Clock) | Memory Clock | SPs |
|---|---|---|---|---|---|---|
| GeForce GTX 295 | 2 x GT200b | 2 x 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 2 x 240 |
| GeForce GTX 285 | GT200b | 1,024 MB GDDR3 | 648 MHz | 4.0, 1,476 MHz | 2 x 1,242 MHz | 240 |
| GeForce GTX 280 | GT200 | 1,024 MB GDDR3 | 602 MHz | 4.0, 1,296 MHz | 2 x 1,107 MHz | 240 |
| GeForce GTX 275 | GT200b | 896 MB GDDR3 | 633 MHz | 4.0, 1,404 MHz | 2 x 1,134 MHz | 240 |
| Zotac GeForce GTX 260² (GTX 260, 216 SPs) | GT200b | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 216 |
| GeForce GTX 260 (216 SPs) | GT200b | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 216 |
| GeForce GTX 260 | GT200 | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 192 |
| Zotac GTS 250 1 GB (GTS 250) | G92b | 1,024 MB GDDR3 | 740 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce GTS 250 | G92b | 1,024 MB GDDR3 | 740 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GX2 | 2 x G92 | 2 x 512 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 1,000 MHz | 2 x 128 |
| GeForce 9800 GTX+ | G92b | 512 MB GDDR3 | 738 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GTX | G92 | 512 MB GDDR3 | 675 MHz | 4.0, 1,688 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9600 GT | G94 | 1,024 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 9600 GT | G94 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 8800 GTS 512 | G92 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 972 MHz | 128 |
| GeForce 8800 GT | G92 | 1,024 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 GT | G92 | 512 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 Ultra | G80 | 768 MB GDDR3 | 612 MHz | 4.0, 1,512 MHz | 2 x 1,080 MHz | 128 |
| GeForce 8800 GTX | G80 | 768 MB GDDR3 | 576 MHz | 4.0, 1,350 MHz | 2 x 900 MHz | 128 |
| GeForce 8800 GTS | G80 | 640 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |
| GeForce 8800 GTS | G80 | 320 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |

ATI Graphics Cards

| Vendor and GPU | Codename | RAM | GPU Clock | Shader Model | Memory Clock | SPs |
|---|---|---|---|---|---|---|
| Radeon HD 4890 | RV790 | 1,024 MB GDDR5 | 850 MHz | 4.1 | 4 x 975 MHz | 800 |
| Radeon HD 4870 X2 | R700 (2 x RV770) | 2 x 1,024 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 2 x 800 |
| HIS H487QT1GP ICEQ4+ (HD 4870) | RV770 | 1,024 MB GDDR5 | 770 MHz | 4.1 | 4 x 1,000 MHz | 800 |
| Sapphire Vapor-X HD4870 2G (HD 4870) | RV770 | 2,048 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 800 |
| Radeon HD 4870 | RV770 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 800 |
| Sapphire HD4850 1G (HD 4850) | RV770 | 1,024 MB GDDR3 | 625 MHz | 4.1 | 2 x 993 MHz | 800 |
| Radeon HD 4850 | RV770 | 512 MB GDDR3 | 625 MHz | 4.1 | 2 x 993 MHz | 800 |
| Radeon HD 4770 | RV740 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 800 MHz | 640 |
| Radeon HD 4670 | RV730 | 512 MB GDDR3 | 750 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| Radeon HD 3870 X2 | R680 (2 x RV670) | 823 MHz clock, 2 x 512 MB GDDR3 | 823 MHz | 4.1 | 2 x 900 MHz | 2 x 320 |
| Radeon HD 3870 | RV670 | 512 MB GDDR4 | 776 MHz | 4.1 | 2 x 1,125 MHz | 320 |
| Radeon HD 3850 | RV670 | 256 MB GDDR3 | 668 MHz | 4.1 | 2 x 829 MHz | 320 |

SPs = stream processors; R680 = 2 x RV670; R700 = 2 x RV770. Shader Model 2.0 = DirectX 9.0; 3.0 = DirectX 9.0c; 4.0 = DirectX 10; 4.1 = DirectX 10.1.
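A note on reading the memory clock columns: the "2 x" and "4 x" prefixes reflect the data-rate multiplier of the memory type (GDDR3/GDDR4 transfer data twice per base clock, GDDR5 four times). As a rough illustration, the sketch below turns a base memory clock and a bus width into peak theoretical bandwidth. The bus widths are not given in the tables above; the 448-bit and 256-bit values are the published figures for the GTX 260 and HD 4870, used here as assumptions.

```python
# Peak theoretical memory bandwidth from a table entry such as "2 x 999 MHz".
# The bus width is an input assumption; it is not listed in the tables above.

def memory_bandwidth_gbs(base_clock_mhz, multiplier, bus_width_bits):
    """Effective clock (base x multiplier) times bus width, in GB/s."""
    effective_clock_hz = base_clock_mhz * 1e6 * multiplier
    return effective_clock_hz * bus_width_bits / 8 / 1e9  # bits -> bytes -> GB/s

# GeForce GTX 260 (2 x 999 MHz, published 448-bit bus): ~111.9 GB/s
print(f"{memory_bandwidth_gbs(999, 2, 448):.1f} GB/s")
# Radeon HD 4870 (4 x 900 MHz GDDR5, published 256-bit bus): ~115.2 GB/s
print(f"{memory_bandwidth_gbs(900, 4, 256):.1f} GB/s")
```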

 

Test System For Nvidia and ATI Graphics Cards

| Component | Details |
|---|---|
| Motherboard | Asus P6T, PCIe 2.0, ICH10R, 3-way SLI |
| Chipset | Intel X58 |
| Memory | Corsair TR3X6G1600C8D, 3 x 2 GB DDR3, 2 x 570 MHz, 8-8-8-20 |
| Audio | Realtek ALC1200 |
| LAN | Realtek RTL8111C |
| HDDs | Western Digital Raptor WD300HLFS and WD5000AAKS (SATA) |
| DVD | Gigabyte GO-D1600C |
| Power Supply | CoolerMaster RS-850-EMBA, 850 W |

Drivers and Configuration

| Setting | Details |
|---|---|
| Graphics Drivers | ATI Catalyst 9.5, Nvidia GeForce 185.85 WHQL |
| OS | Windows Vista Ultimate 32-bit, SP1 |
| DirectX | 9, 10, and 10.1 |
| Chipset Driver | Intel 9.1.0.1007 |

Comments
  • ColMirage, June 17, 2009 7:00 AM (+0)
    Great article! Good to see a large variety of old and new.
  • Anonymous, June 17, 2009 7:01 AM (+34)
    Why do you keep on including The Last Remnant test when it's obvious that there is a problem with the ATI cards? The overall results are therefore biased, and that's unfair to ATI and to the folks who jump directly to the conclusion.

    Also, when you say, quote, "DirectX 10 crashed at 8x AA and the game and screen went black. Switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher," unquote, for HAWX, didn't you mean "ATI cards were a lot faster than Nvidia ones using DirectX 10 thanks to DirectX 10.1, and that was unacceptable. Hence the switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher for Nvidia, and ATI is fcked again; close one, guys"?

    I am not an ATI fanboy, but I think TH has got its tongue stuck up a juicy green @ss.
  • cinergy, June 17, 2009 7:07 AM (+28)
    Tino is again putting up a big GeForce ad. No mention of the recent HUGE Radeon price cuts (e.g., the Radeon HD 4890 goes for $199.99 minus a $10 mail-in rebate on Newegg, and the 4850 should go for $99). And HAWX is again benchmarked without the DX10.1 setting because of such crappy results for ATI. And not even a mention that such technology exists in the game!
  • NuclearShadow, June 17, 2009 7:12 AM (+35)
    I think it's highly unfair that you would put The Last Remnant in as a benchmark. The game simply hates ATI cards, and if you included that game when making your conclusion, then I think you're intentionally being biased.

    Also, I'm not sure why you're holding up the 260 as the best choice. The Zotac one you picture is priced at $175 at Newegg, while the HIS 4850 1GB is about $115 at Newegg. Sure, the 260 outperforms it, but when you take that price difference into account and look at the performance, the 4850 1GB is certainly attractive. The 250 1GB's lowest price on Newegg is $140, and Zotac's costs $154.99; if you compared the 4850 1GB to it using your own charts, you would see that the big gap in the sum-of-FPS figure is largely an effect of The Last Remnant.

    Speaking of Zotac, I noticed that for some reason, whenever they are mentioned, they get a major ass-kissing. While they make good products, it's clear that there is a bias here. You even picture the Zotac 260 and gave it the ability to be selected on your own little comparison charts, and look what you get when you compare it to a normal 260 216SP: http://www.tomshardware.com/charts/gaming-graphics-charts-2009/compare,1173.html?prod[2447]=on&prod[2463]=on The exact same results, yet for some reason you deemed it necessary to list it individually as if it were special.

    Next time, how about giving a real conclusion instead of an advertisement? Comparing a $115 card to a $175 one and pushing Zotac down our throats makes it damn obvious what you're doing.
  • scrumworks, June 17, 2009 7:21 AM (+28)
    Wow! Tom's just can't let Nvidia go. ATI clearly has the price-performance advantage now. No Last Remnant benches are gonna change that.
  • Summer Leigh Castle, June 17, 2009 8:05 AM (+21)
    I'm not an expert, but the article felt like Tom's was trying too hard... just a "little" bias here.
  • da bahstid, June 17, 2009 9:09 AM (+27)
    Tino must have missed that whole thing a few months back where educated readers decided they weren't going to tolerate such ridiculously biased conclusions. There's only a $5 difference in price between the HIS 4870 and the Zotac 260 on Newegg as I write this (nothing like Tino's claim that the Zotac has a >$35 advantage), and the performance of the two came within 0.5% of each other... DESPITE two extremely pro-Nvidia slanted tests (The Last Remnant and HAWX).

    I actually encourage keeping The Last Remnant test, because ATI shouldn't get breaks for poor drivers (or inadequate collaboration with developers), but by that same token, if Nvidia loses out on its lack of 10.1 support, that result absolutely needs to be included. TH was actually starting to look credible again, and it must have taken you guys months of seriously attentive work and comprehensive benchmarking to regain that... what in the world are you thinking, starting up this tripe again?
  • Ramar, June 17, 2009 9:11 AM (+3)
    I agree, The Last Remnant is stupid. Everyone with an ATI card knows they probably can't play it. Funny, considering it was developed for a console with an ATI chip.

    Let's do some simple math here to see if ATI really has the price-to-performance advantage.

    I'll use Far Cry 2 because I think it's a very fair measure of DirectX 10 power, "TWIMTBP" be damned.

    Top range, GTX 295 vs. 4870 X2: there's a performance difference on par with their respective prices, especially at the highest resolution and AA/AF setting.

    Upper mid-range, 4890 vs. GTX 275: again, the performance percentage is very close to the twenty-dollar difference between the cards, and exceeded on Nvidia's side at the highest resolution.

    High mid-range, 4870 vs. GTX 260 216: even in Left 4 Dead, a Source-engine game favoring ATI, the 260 comes out on par. This is a tie, really. But don't kid yourself into thinking the 4850 is any kind of match for the 260.

    Mid-range, 4850 vs... well, if you take a GTS 250, they're very evenly matched. If you REALLY want a 9800 GT for the same price, well, sucks to have an IQ of 50.

    Also factor in the growing use of PhysX, and ATI doesn't make a very compelling argument. Prices are matched frustratingly well, and the only real "killer deal" is a 4850 X2 for slightly over $200.

    Know that I like ATI and I'm not saying they're bad cards; I'm just saying they're only on par with Nvidia's offerings, not above them.

    Just remember that DirectX 11 will be in full swing in around six months, and none of this will matter anyway.
  • d0gr0ck, June 17, 2009 9:12 AM (+11)
    I don't know what you did to get those TLR benches on the ATI cards. On a single HD 4870 (512 MB, reference style), I easily got a playable 60 fps at 1680x1050 at high settings plus medium shadows. Once I upgraded to CrossFire, the frame rate blows past 60 fps on all-high settings. Slowdown only occurs when the game churns out an inordinate amount of lighting/shadow effects. I game on an old X38/E8500 combo, so by all means you should be getting better results than I do.
  • JeanLuc, June 17, 2009 9:38 AM (+23)
    Who in all honesty plays The Last Remnant? It's hardly the biggest and best PC game on the market right now; it just makes me wonder why Tom's Hardware would pick such an obscure console port to benchmark video cards on.

    Once again, TH tests HAWX in DX10, not in DX10.1. Can someone please explain why you wouldn't test a game using all the available technology? The excuse "it's not fair to compare a video card that can only utilize DX10 against video cards that can use 10.1" is unacceptable. If you own an ATI card or plan on buying one, you should make it clear that the Radeon card has the advantage, as anyone who has DX10 can get DX10.1.
  • d0gr0ck, June 17, 2009 10:20 AM (-2)
    JeanLuc: "It's hardly the biggest and best PC game on the market right now; it just makes me wonder why Tom's Hardware would pick such an obscure console port to benchmark video cards on."

    It is one of the more recent games to use the Unreal 3 engine on the PC, so it's probably serving the place of what used to be held by Unreal Tournament 3 in the benches. Just my guess.
  • chyll2, June 17, 2009 10:24 AM (+16)
    Another Nvidia ad.
  • neiroatopelcc, June 17, 2009 10:29 AM (+14)
    Page 15 says, "Current Nvidia graphics chips usually produce a CPU score between 6300 and 6400, whereas Nvidia chips come in around 6600." A bit much Nvidia here, perhaps! You might want to make that last one an ATI name.
  • gkay09, June 17, 2009 10:29 AM (+15)
    I have been reading good articles here, but this one certainly is a letdown...
    We all know for a fact that even Nvidia has acknowledged that DirectX 10.1 has a lot more to offer than DirectX 10, and the newer Nvidia mobile GPUs are DirectX 10.1 compliant...
    http://www.xbitlabs.com/news/video/display/20090616133250_Nvidia_s_New_Graphics_Chips_Support_DirectX_10_1.html

    And there are XFX 4870 1GB cards for $150 on Newegg right now, and their performance is very much comparable with the GTX 260... so for about $150, would you still suggest the GTS 250 1GB?
  • Anonymous, June 17, 2009 10:33 AM (+8)
    Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias .........
  • rhys216, June 17, 2009 11:24 AM (+13)
    Lol! I'm glad it's not just me!
    I'm hoping Tino is simply biased and unable to put together an honest comparison, rather than Tom's Hardware being in the pocket of Intel and Nvidia!
    All I know is something stinks, as you don't get it this wrong by accident.
  • rhys216, June 17, 2009 11:31 AM (+18)
    @Ramar!
    DX10.1 was originally going to be DX10.
    Nvidia couldn't cut the mustard in time, and M$ watered down DX10 to help out Nvidia!
    It had nothing to do with Nvidia being smart; in fact, it's the complete opposite, lol!

    Perhaps you haven't had enough sleep? How about you go back to bed and cosy on up with Tino some more?
  • neiroatopelcc, June 17, 2009 11:41 AM (+5)
    Ramar: "DirectX 10.1 makes an almost unnoticeable difference and Nvidia is simply smart enough to wait for DX11. It's only on those mobile processors because it's a selling point in a very hostile environment. I just checked Newegg and the difference between buying the cheapest 4870 1GB and the cheapest GTX 260 216 is 10 dollars, with a 10 dollar discount on the 4870. They are almost identically performing cards, but I'd personally rather spend the extra ten dollars on PhysX than DX10.1."

    I've got a 4870, and I can honestly say 10.1 makes NO DIFFERENCE in the real world. But then PhysX doesn't either (unless you play that Mirror's Edge thing, which is EA, thus you don't). Buy what is cheapest.

    rhys216: "Lol! I'm glad it's not just me! I'm hoping Tino is simply biased and unable to put together an honest comparison, rather than Tom's Hardware being in the pocket of Intel and Nvidia! All I know is something stinks, as you don't get it this wrong by accident."

    I don't necessarily agree with you. They did include one game that runs exceptionally poorly on ATI hardware, and perhaps should've excluded it from the combined results, but apart from that it might well be honest.
    I believe Tino is from the German office, right? And according to idealo.de (a German price-comparison database), the cheapest reasonable 1 GB card is a 4830 at 87€. The cheapest GTX 260 costs 2€ less than a 4870 1GB and performs slightly better on average. It's not like the 4870 is really poor or anything, but the 260 simply is a bit better.
    That said, I've just recommended that a colleague of mine buy a 4670 for her kids' computer, and I expect to buy two 4770s for my parents' system.
    Anyway, why do you accuse them of being paid by Intel? Intel wasn't even mentioned except for what system they were testing on. And given that the X58 is THE ONLY choice for both CrossFire and SLI at once, that's a no-brainer.