GPU vs. CPU Upgrade: Extensive Tests

Comparison of Graphics Chips and Introduction of the Test Configuration

The Geforce 6 and 7 cards are not overclocked. If the Geforce 8800 GTS 512 OC were operated at its standard clock rate, it would be as fast as the Geforce 8800 GT OC in terms of overall performance. The Geforce 9600 GT has 1024 MB of graphics memory, while the new Geforce 9800 GTX competes for first place with the Geforce 8800 GTS 512. Which of the two models comes out on top depends on the clock rates used; both cards are based on the G92 graphics chip and are too similar in their technical specifications and clock rates for any relevant performance differences to emerge.

| Card manufacturer and chip | Code name | Memory | Shader (model, clock) | GPU Speed | Memory Data Rate |
| --- | --- | --- | --- | --- | --- |
| Geforce 9800 GTX | G92 | 512 MB GDDR3 | 4.0, 1688 MHz | 675 MHz | 2200 MHz |
| Geforce 9600 GT OC | G94 | 1024 MB GDDR3 | 4.0, 1680 MHz | 700 MHz | 1900 MHz |
| Geforce 8800 GTS OC | G92 | 512 MB GDDR3 | 4.0, 1825 MHz | 730 MHz | 1944 MHz |
| Geforce 8800 GT OC | G92 | 512 MB GDDR3 | 4.0, 1650 MHz | 660 MHz | 1900 MHz |
| Geforce 7950 GT | G71 | 512 MB GDDR3 | 3.0 | 550 MHz | 1400 MHz |
| Geforce 6800 GT | NV40 | 256 MB GDDR3 | 3.0 | 350 MHz | 1000 MHz |

OC = overclocked (speed is higher than standard)
Memory Data Rate = physical clock rate times two
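
To make that footnote concrete, here is a minimal sketch in Python (illustrative only, not part of the test setup) that reproduces the memory data rates listed in the table above; the physical clock values used here are simply the listed data rates halved.

```python
# Illustrative sketch: GDDR3 transfers data on both clock edges (DDR),
# so the effective "Memory Data Rate" is the physical memory clock times two.
# Physical clocks below are back-calculated from the table (data rate / 2).
physical_memory_clock_mhz = {
    "Geforce 9800 GTX": 1100,
    "Geforce 9600 GT OC": 950,
    "Geforce 8800 GTS OC": 972,
    "Geforce 8800 GT OC": 950,
    "Geforce 7950 GT": 700,
    "Geforce 6800 GT": 500,
}

for card, clock in physical_memory_clock_mhz.items():
    data_rate = clock * 2  # two transfers per clock cycle
    print(f"{card}: {clock} MHz physical clock -> {data_rate} MHz effective data rate")
```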

| Component | Details |
| --- | --- |
| CPUs | Intel E2160 @ 1.8 GHz, E2160 @ 2.41 GHz, E6750 @ 2.67 GHz, Q6600 @ 2.4 GHz, Q6600 @ 3.2 GHz, X6800 EE @ 2.94 GHz |
| Cooler | Zalman 9700 LED |
| Motherboard | Asus P5E3 Deluxe, PCIe 2.0 2x16, ICH9R |
| Chipset | Intel X38 |
| Memory | 2x 1 GByte Ballistix (Crucial Technology), 1.5 Volt, DDR3-1066 7-7-7-20 (2x 533 MHz) |
| Audio | Intel High Definition Audio |
| LAN | Intel 1000 Pro |
| Hard drives | Western Digital WD5000AAKS, 500 GByte, S-ATA, 16 MB cache; Hitachi, 120 GByte, S-ATA, 8 MB cache |
| DVD | Gigabyte GO-D1600C |
| Power supply | CoolerMaster RS-850-EMBA, 850 Watt |
| Software | Details |
| --- | --- |
| Graphics driver | Nvidia Forceware 174.53 (Geforce 9800 GTX: 174.74) |
| Operating system | Windows Vista Enterprise |
| DirectX | 10 |
| Chipset driver | Intel X38, 8.3.1.1009 |

For testing, all graphics cards ran a version of the new 174 driver series introduced with the Geforce 9600 GT. The Geforce 9800 GTX required version 174.74 because its graphics chip is not yet supported by the official releases. Microsoft Flight Simulator X with SP2 and the DX10 preview mode still shows rendering errors in the water pixel shader; as a result, the simulated waves are missing. In DX9 mode (Geforce 6 and 7) everything appears to be fine.

  • DjEaZy
    will there be an AMD/ATI roundup???
    Reply
  • randomizer
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generational differences between cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of an HD3870. There is no need to include ATI cards for the sake of this article.
    Reply
  • randomizer
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
    Reply
  • yadge
    I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600 GT was getting three times the frames of the 7950 GT (which is better than mine) on Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
    Reply
  • scy
    This article is biased from the beginning by comparing a reference graphics card from 2004 (6800GT) with a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004) and then the basis of these percentage values on page 3 will actually mean something.
    Reply
  • randomizer
    yadge: "I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600 GT was getting three times the frames of the 7950 GT (which is better than mine) on Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future."
    I upgraded my X1950 Pro to a 9600 GT. It was a fantastic upgrade.
    Reply
  • wh3resmycar
    scy: "This article is biased from the beginning by comparing a reference graphics card from 2004 (6800GT) with a reference CPU from 2007 (E2140)."
    Maybe it is, but it's relevant, especially for people who are stuck with those Prescotts/6800GTs. This article reveals an upgrade path nonetheless.
    Reply
  • randomizer
    If they had used P4s, there would be so many variables in this article that there would be no direction, and that would make it pointless.
    Reply
  • JAYDEEJOHN
    Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4s, just take the clock rate and cut it in half, then compare (ok, add 10%) heheh
    Reply
  • justjc
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds for AMD processors and ATi graphics?
    For starters, we know that ATi and nVidia cards don't calculate graphics in the same way, so who knows, perhaps an ATi card requires more or less processor power to run at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see that the minimum required AMD processor is slower than the minimum required Core2, even in raw processor speed.
    So any chance of a small, or full scale, article throwing some ATi and AMD power into the mix?
    Reply