
Graphics Chips Compared And Test Configuration

Roundup: Mainstream Graphics Cards From ATI And Nvidia

We used an Intel Core i7-920 CPU overclocked to 3.8 GHz for this round of graphics card testing.

Nvidia Graphics Cards

| Graphics Card | Codename | Graphics RAM | GPU Clock | Shader | Memory Clock | SPs |
|---|---|---|---|---|---|---|
| GeForce GTX 275 | GT200b | 896 MB GDDR3 | 633 MHz | 4.0, 1,404 MHz | 2 x 1,134 MHz | 240 |
| GeForce GTX 260 | GT200b | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 216 |
| GeForce GTX 260 | GT200 | 896 MB GDDR3 | 576 MHz | 4.0, 1,242 MHz | 2 x 999 MHz | 192 |
| GeForce GTS 250 | G92b | 1,024 MB GDDR3 | 740 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GTX+ | G92b | 512 MB GDDR3 | 738 MHz | 4.0, 1,836 MHz | 2 x 1,100 MHz | 128 |
| GeForce 9800 GTX | G92 | 512 MB GDDR3 | 675 MHz | 4.0, 1,688 MHz | 2 x 1,100 MHz | 128 |
| Zotac GeForce 9800GT 512MB (9800 GT) | G92b | 512 MB GDDR3 | 660 MHz | 4.0, 1,600 MHz | 2 x 900 MHz | 112 |
| GeForce 9800 GT | G92b | 512 MB GDDR3 | 600 MHz | 4.0, 1,512 MHz | 2 x 900 MHz | 112 |
| GeForce 9600 GT | G94 | 1,024 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 9600 GT | G94 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 900 MHz | 64 |
| GeForce 8800 GTS 512 | G92 | 512 MB GDDR3 | 650 MHz | 4.0, 1,625 MHz | 2 x 972 MHz | 128 |
| GeForce 8800 GT | G92 | 1,024 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 GT | G92 | 512 MB GDDR3 | 600 MHz | 4.0, 1,500 MHz | 2 x 900 MHz | 112 |
| GeForce 8800 Ultra | G80 | 768 MB GDDR3 | 612 MHz | 4.0, 1,512 MHz | 2 x 1,080 MHz | 128 |
| GeForce 8800 GTX | G80 | 768 MB GDDR3 | 576 MHz | 4.0, 1,350 MHz | 2 x 900 MHz | 128 |
| GeForce 8800 GTS | G80 | 640 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |
| GeForce 8800 GTS | G80 | 320 MB GDDR3 | 513 MHz | 4.0, 1,188 MHz | 2 x 792 MHz | 96 |
| GeForce 8600 GTS | G84 | 512 MB GDDR3 | 675 MHz | 4.0, 1,450 MHz | 2 x 1,008 MHz | 32 |

ATI Graphics Cards

| Graphics Card | Codename | Graphics RAM | GPU Clock | Shader | Memory Clock | SPs |
|---|---|---|---|---|---|---|
| Radeon HD 4890 | RV790 | 1,024 MB GDDR5 | 850 MHz | 4.1 | 4 x 975 MHz | 800 |
| Radeon HD 4870 | RV770 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 900 MHz | 800 |
| Radeon HD 4850 | RV770 | 512 MB GDDR3 | 625 MHz | 4.1 | 2 x 993 MHz | 800 |
| Sapphire HD 4830 512 MB (HD 4830) | RV770 | 512 MB GDDR3 | 575 MHz | 4.1 | 2 x 900 MHz | 640 |
| Radeon HD 4830 | RV770 | 512 MB GDDR3 | 575 MHz | 4.1 | 2 x 900 MHz | 640 |
| Sapphire HD 4770 512 MB (HD 4770) | RV740 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 800 MHz | 640 |
| Radeon HD 4770 | RV740 | 512 MB GDDR5 | 750 MHz | 4.1 | 4 x 800 MHz | 640 |
| HIS H467QT512P (HD 4670) CF | RV730 | 512 MB GDDR3 | 780 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| Radeon HD 4670 CF | RV730 | 512 MB GDDR3 | 750 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| HIS H467QT512P (HD 4670) | RV730 | 512 MB GDDR3 | 780 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| HIS H467PS1GP (HD 4670) | RV730 | 1,024 MB GDDR3 | 750 MHz | 4.1 | 2 x 850 MHz | 320 |
| Radeon HD 4670 | RV730 | 512 MB GDDR3 | 750 MHz | 4.1 | 2 x 1,000 MHz | 320 |
| Radeon HD 3870 | RV670 | 512 MB GDDR4 | 776 MHz | 4.1 | 2 x 1,125 MHz | 320 |
| Radeon HD 3850 | RV670 | 256 MB GDDR3 | 668 MHz | 4.1 | 2 x 829 MHz | 320 |

Memory Clock: the listed effective data rate is the GDDR3/GDDR4 clock x 2 or the GDDR5 clock x 4. SPs = stream processors. CF = CrossFire configuration with two ATI cards running in parallel. Shader 2.0 = DirectX 9.0, 3.0 = DirectX 9.0c, 4.0 = DirectX 10, 4.1 = DirectX 10.1.
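The doubling/quadrupling rule in the footnote is simple arithmetic; the following minimal sketch applies it to a few clocks taken from the tables above (the dictionary and helper function names are my own, not part of the article):

```python
# Minimal sketch of the footnote's rule: GDDR3/GDDR4 transfer data twice per
# clock cycle and GDDR5 four times, so the "2 x" / "4 x" figures in the tables
# are the base memory clock multiplied by the per-clock transfer factor.

DDR_FACTOR = {"GDDR3": 2, "GDDR4": 2, "GDDR5": 4}

def effective_data_rate(base_clock_mhz: float, memory_type: str) -> float:
    """Return the effective memory data rate in MT/s (often quoted as MHz)."""
    return base_clock_mhz * DDR_FACTOR[memory_type]

# Examples pulled from the tables above:
print(effective_data_rate(999, "GDDR3"))   # GeForce GTX 260: 1,998 MT/s
print(effective_data_rate(900, "GDDR5"))   # Radeon HD 4870: 3,600 MT/s
print(effective_data_rate(975, "GDDR5"))   # Radeon HD 4890: 3,900 MT/s
```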

 

Test PC Configuration

| Component | Details |
|---|---|
| CPU | Intel Core i7-920 @ 3.8 GHz (20 x 190 MHz), 1.2625 V set in BIOS, 45 nm, Socket LGA 1366 |
| Motherboard | Asus P6T, PCIe 2.0, ICH10R, 3-way SLI |
| Chipset | Intel X58 |
| Memory | Corsair TR3X6G1600C8D, 3 x 2 GB DDR3, 2 x 570 MHz, 8-8-8-20 |
| Audio | Realtek ALC1200 |
| LAN | Realtek RTL8111C |
| HDDs | SATA: Western Digital VelociRaptor WD3000HLFS, WD5000AAKS |
| DVD | Gigabyte GO-D1600C |
| Power Supply | Cooler Master RS-850-EMBA, 850 W |

Drivers and Settings

| Setting | Details |
|---|---|
| Graphics | ATI Catalyst 9.6, Nvidia GeForce 186.18 |
| OS | Windows Vista Ultimate 32-bit, SP1 |
| DirectX | 9, 10, and 10.1 |
| Chipset Driver | Intel 9.1.0.1007 |
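For reference, the overclock in the CPU row is just the base clock (BCLK) multiplied by the CPU multiplier, and the DDR3 figure follows the same doubling rule as the graphics memory above. A short sketch of that arithmetic (variable names are my own):

```python
# Core i7-920 overclock from the configuration table: 20 x 190 MHz BCLK.
bclk_mhz = 190
multiplier = 20
print(bclk_mhz * multiplier / 1000)   # 3.8 (GHz core clock)

# DDR3 runs at 2 x 570 MHz, i.e. an effective 1,140 MT/s.
ddr3_base_mhz = 570
print(ddr3_base_mhz * 2)              # 1140
```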

Comments
  • Bloodblender, August 10, 2009 6:06 AM:
    All I can say is that Tom's recent articles have been an excellent read, and this is exactly the stuff I (as well as many others) need for research purposes. Keep up the great work!
  • dirtmountain, August 10, 2009 6:21 AM:
    Nice article, very well done, but you need to show the 4670 in CF as costing $162, not $81 as shown in the final chart.
  • rambo117, August 10, 2009 6:55 AM:
    The IceQ concept is amazing. Keeps my 3870s nice and chilly (70°C) while hardcore gaming, and not to mention they both look intimidating in my case ;)
  • pij, August 10, 2009 6:58 AM:
    Quick question - 4770s in CrossFire or a single 4890, best bet?
  • Anonymous, August 10, 2009 7:01 AM:
    To me the gaming benches are most important, but energy efficiency and heat dissipation run a close second. Thanks for providing it all!
  • Julianbreaker, August 10, 2009 7:36 AM:
    Newegg has quite a few 4850s that retail for $100, and it appears to be getting consistently better benchmarks than the 4770. I am confused as to why you would not recommend it over the 4770. Perhaps you are confused by simple maths.
  • radiowars, August 10, 2009 8:09 AM:
    Quoting pij: "Quick question - 4770s in CrossFire or a single 4890, best bet?"
    They already did a whole article on that...
  • bucifer, August 10, 2009 8:35 AM:
    I don't understand why you still won't use the 1 GB version of the Radeon 4870. It's clear to me that the card is limited by its amount of video memory at high resolutions with AA and AF. Searching for prices in the US and Europe, it retails cheaper than the GTX 260 (192 or 216). The point is: the card should be included in the test just like the GTX 260-216. It's clearly a better option than the 512 MB version and it's good for comparison!
  • masterjaw, August 10, 2009 8:52 AM:
    Nice article here. Most importantly, no unnecessary bias included.
  • holodust, August 10, 2009 8:57 AM:
    Nice article, but I don't see how testing these cards on an i7-920 @ 3.8 GHz fits into mainstream.
  • Hamsterabed, August 10, 2009 9:08 AM:
    It makes it a control to make sure they are only rating the graphics cards and not the CPU. It makes sure the GPUs are the limiting factor.
  • qwertymac93, August 10, 2009 9:18 AM:
    Something is bothering me. I have Left 4 Dead, and when I play it at 8xAA, 16xAF, I get higher frame rates than you do (close to 100). I have a 4830 and the resolution I play at is 1440x900. I know it's not the same as 1680x1050, but the extra AA and AF should at least keep them close, yet I get over 20 fps more than your 4850! I don't have a fancy i7, and my 4830 is clocked at 700/1000. Did you set the AA/AF in the drivers or in-game? Because in-game is almost always better.
  • Sihastru, August 10, 2009 9:52 AM:
    qwertymac93, probably your driver settings (optimizations) are overriding the in-game engine settings.
  • pij, August 10, 2009 10:04 AM:
    Quoting bucifer: "I don't understand why you still won't use the 1 GB version of the Radeon 4870. [...] It's clearly a better option than the 512 MB version and it's good for comparison!"
    Sorry, I thought the article was 'mainstream graphics cards', not 4890 vs. 2 x 4770s in CrossFire! Blimey, I must be going mad.
  • pij, August 10, 2009 10:05 AM:
    Whoops, wrong quote - how silly of me.
  • pij, August 10, 2009 10:07 AM:
    Quoting radiowars: "They already did a whole article on that..."
    Sorry, I thought the article was 'mainstream graphics cards', not 4890 vs. 2 x 4770s in CrossFire! Blimey, I must be going mad.
  • amnotanoobie, August 10, 2009 10:12 AM:
    Quoting Julianbreaker: "Newegg has quite a few 4850s that retail for $100 [...] I am confused as to why you would not recommend it over the 4770."
    The only reason I could think of where the 4770 is better is the smaller manufacturing process, which should make it run cooler and consume less power. Though if raw performance is your concern, the 4850 may be better.
    Quoting holodust: "Nice article, but I don't see how testing these cards on an i7-920 @ 3.8 GHz fits into mainstream."
    They usually do it on the highest-end rig they have to eliminate as many bottlenecks as possible. I think they were just making sure that the video cards' respective scores don't flatten out (i.e., the GTX 275, 260, 4870, and 4890 displaying the same scores when they are clearly somewhat different hardware). Ideally, even on lower-end hardware this chart should still show the same order in terms of performance, though you'd probably lose a few fps.
  • qwertymac93, August 10, 2009 10:16 AM:
    Gee, a thumbs down for asking a question; interesting crowd, huh. No, all my driver settings are set to "application settings" and I can visually confirm that the AA is indeed working. I think it's just a different level. The frame rates I was quoting were from the rooftop part.
  • haplo602, August 10, 2009 10:20 AM:
    Quoting amnotanoobie: "They usually do it on the highest-end rig they have to eliminate as many bottlenecks as possible. [...]"
    Well, then it begs the question: which card is more platform limited? The driver may scale differently with CPU power, so the card winning on the overclocked i7 may actually be the worst on a stock Phenom II X3 720 BE or X2 550 BE. Thus, testing mainstream GPUs on high-end platforms has a flaw here...
  • bucifer, August 10, 2009 10:38 AM:
    Pij, your IQ is below sea level.
    Quoting masterjaw: "Nice article here. Most importantly, no unnecessary bias included."
    As for this statement I have one OBVIOUS mention: why did they use The Last Remnant for testing again?