Roundup: Mainstream Graphics Cards From ATI And Nvidia

Zotac GeForce 9800 GT 512 MB (GeForce 9800 GT, 512 MB)


ATI and Nvidia are both very aware of where their respective strengths lie, so it's almost humorous that, for a story like this one, it's difficult to get access to many of Nvidia's mainstream offerings. The interest is usually in the company's more powerful products, which sell for more than $150. If you were wondering why there are more ATI cards in this roundup, now you know.

Thanks to Zotac, we were able to procure one GeForce-based test card that comes in around $100 and doesn't have to be shielded from faster competition. This tweaked version of the GeForce 9800 GT is able to keep up with ATI's Radeon HD 4770. Its overall performance lags only 1.1% behind the 40 nm Radeon HD 4770, leaving the Radeon HD 4670 and Radeon HD 4830 far behind.

Zotac does not adhere to reference clock rates, and thereby affords its GeForce 9800 GT some extra muscle. Its GPU clocks in at 660 MHz instead of 600 MHz, and the shaders run at 1,600 MHz instead of 1,512 MHz. The memory clock rate remains unaltered at 900 MHz. This card supports DirectX 10 and comes equipped with 512 MB of GDDR3 RAM. The graphics chip is labeled as G92b, is based on 55 nm process technology, and is a direct successor to the GeForce 8800 GT.

The fan is fairly narrow, enabling this card to fit in a single expansion slot. For an Nvidia card with a low-profile cooler, its temperature readings are typical: 58°C in 2D mode and 87°C under heavy 3D load. Compared to competing cards, the Zotac GeForce 9800 GT's power-consumption levels are relatively high, in part because the card doesn't reduce its clock speeds in 2D mode: it consumed 173 W in 2D mode and 292 W under heavy 3D load. Noise levels come in at a quiet 36.3 dB(A) in 2D mode, but climb to a clearly audible 46.8 dB(A) under heavy 3D loads.

This card is about 9.1” (23 cm) long and requires a six-pin PCIe power connector. The retail package includes splitter cables for both power and component video, plus a DVI-to-HDMI adapter and an internal SPDIF cable to carry digital audio output. You’ll also find XIII Century: Death of Glory in the box.

Comments from the forums
  • Bloodblender
    All I can say is that Tom's recent articles have been an excellent read, and this is exactly the stuff I (as well as many others) need for research purposes. Keep up the great work!
    1
  • dirtmountain
    Nice article, very well done, but you need to show the 4670 in CF as costing $162, not $81 as shown in the final chart.
    3
  • rambo117
    the iceQ concept is amazing. keeps my 3870s nice and chilly (70C) while hardcore gaming
    and not to mention they both look intimidating in my case ;)
    -4
  • pij
    Quick question -

    4770 in crossfire or single 4890 best bet???..
    0
  • Anonymous
    to me the gaming benches are most important but energy efficiency and heat dissipation run a close 2nd. thanks for providing it all!
    1
  • Julianbreaker
    Newegg has quite a few 4850s that retail for $100, and they appear to be getting consistently better benchmarks than the 4770. I am confused as to why you would not recommend it over the 4770. Perhaps you are confused by simple maths.
    8
  • radiowars
    Pij: Quick question - 4770 in crossfire or single 4890 best bet???..

    They already did a whole article on that...
    -4
  • bucifer
    I don't understand why you still won't use the 1GB version of the Radeon 4870. It's clear to me that the card is limited by its amount of video memory when using hi-res, AA, and AF.
    Searching for prices in the US and Europe, it retails cheaper than the GTX260 (192 or 216).
    The point is: the card should be included in the test just as the GTX260-216 is. It's clearly a better option than the 512 MB version and it's good for comparison!
    8
  • masterjaw
    Nice article here. Most importantly, no unnecessary bias included.
    -2
  • holodust
    Nice article, but I don't see how testing these cards on an i7 920 @ 3.8 GHz fits into mainstream.
    8
  • Hamsterabed
    makes it a control to make sure they are only rating the graphics cards and not the CPU. makes sure the GPUs are the limiting factor
    5
  • qwertymac93
    something is bothering me. i have left 4 dead, and when i play it at 8xAA, 16xAF, i get higher frame rates than you do (close to 100). i have a 4830 and the res i play at is 1440x900. i know it's not the same as 1680x1050, but the extra AA and AF should at least keep them close, yet i get over 20fps more than your 4850! i don't have a fancy i7 and still. my 4830 is clocked at 700/1000. did you set the aa/af in the drivers or in-game? because in-game is almost always better.
    0
  • Sihastru
    qwertymac93, probably your driver settings (optimizations) are overriding the in-game engine settings.
    0
  • pij
    Bucifer: I don't understand why you still won't use the 1GB version of the Radeon 4870. It's clear to me that the card is limited by its amount of video memory when using hi-res, AA, and AF. Searching for prices in the US and Europe, it retails cheaper than the GTX260 (192 or 216). The point is: the card should be included in the test just as the GTX260-216 is. It's clearly a better option than the 512 MB version and it's good for comparison!

    Sorry, I thought the article was 'mainstream graphics cards', not 4890 vs 2x4770s in crossfire! Blimey, I must be going mad.
    0
  • pij
    whoops wrong quote - how silly of me.
    -2
  • pij
    Radiowars: They already did a whole article on that...

    Sorry, I thought the article was 'mainstream graphics cards', not 4890 vs 2x4770s in crossfire! Blimey, I must be going mad.
    1
  • amnotanoobie
    Julianbreaker: Newegg has quite a few 4850s that retail for $100, and they appear to be getting consistently better benchmarks than the 4770. I am confused as to why you would not recommend it over the 4770. Perhaps you are confused by simple maths.

    The only reason I could think of where the 4770 is better is the smaller manufacturing process, which should make it run cooler and consume less power. Though if raw performance is your concern, the 4850 may be better.

    Holodust: Nice article, but I don't see how testing these cards on an i7 920 @ 3.8 GHz fits into mainstream.

    They usually do it on the highest-end rig they have to eliminate as many bottlenecks as possible. I think they were just making sure the video cards' respective scores do not flatten out (i.e., the GTX 275, 260, 4870, and 4890 displaying the same scores when they are clearly somewhat different hardware). Ideally, even on lower-end hardware this chart should still show the same order in terms of performance, though you'd probably lose a few fps.
    3
  • qwertymac93
    gee, a thumbs down for asking a question, interesting crowd huh. no, all my driver settings are set to "application settings" and i can visually confirm that the AA is indeed working. i think it's just a different level. the frame rates i was quoting were from the rooftop part.
    -2
  • haplo602
    Amnotanoobie: They usually do it on the highest-end rig they have to eliminate as many bottlenecks as possible. I think they were just making sure the video cards' respective scores do not flatten out (i.e., the GTX 275, 260, 4870, and 4890 displaying the same scores when they are clearly somewhat different hardware). Ideally, even on lower-end hardware this chart should still show the same order in terms of performance, though you'd probably lose a few fps.

    Well then it begs the question: which card is more platform limited? I mean, the driver may scale differently with CPU power, so the card winning on the overclocked i7 may actually be the worst on a stock PII X3 720 BE or X2 550 BE.

    Thus testing mainstream GPUs on high-end platforms has a flaw here...
    -2
  • bucifer
    Pij, your IQ is below sea level.
    Masterjaw: Nice article here. Most importantly, no unnecessary bias included.

    As for this statement, I have one OBVIOUS point: why did they use The Last Remnant for testing again?
    3