GPU vs. CPU Upgrade: Extensive Tests

3D Performance for the CPU

The following tables show how overall 3D performance increases when a given CPU is paired with a faster graphics card. For each CPU, the GeForce 6800 GT sets the 100% reference point, so a rating of 220% marks a 120% increase over that baseline. Each percentage is calculated from the overall results of all game benchmarks achieved by that combination of CPU and graphics card.
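To illustrate how such a rating can be derived, here is a minimal Python sketch. It assumes the overall result is simply the sum of each card's average frame rates across all game benchmarks and that the GeForce 6800 GT figure is used as the 100% baseline; the article does not spell out its exact aggregation, and all numbers below are made-up placeholders rather than measured values.

```python
# Minimal sketch of the normalization described above: aggregate each
# card's results across all game benchmarks, then express them as a
# percentage of the GeForce 6800 GT baseline. Frame rates are placeholders.

baseline_card = "Geforce 6800 GT (256 MB)"

# Hypothetical per-game average fps for one CPU (e.g. the E2160@1.8).
results_fps = {
    "Geforce 6800 GT (256 MB)":    {"Game A": 24.0, "Game B": 31.0},
    "Geforce 8800 GT OC (512 MB)": {"Game A": 102.0, "Game B": 129.0},
}

def overall(card: str) -> float:
    """Overall result: sum of one card's results over all benchmarks."""
    return sum(results_fps[card].values())

def rating_percent(card: str) -> float:
    """Overall result relative to the baseline card, in percent."""
    return overall(card) / overall(baseline_card) * 100.0

for card in results_fps:
    # A rating of 220.0 would mean a 120% increase over the 6800 GT.
    print(f"{card}: {rating_percent(card):.1f}%")
```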

Performance increase of E2160@1.8 by means of graphics card | Percent
Geforce 6800 GT (256 MB) E2160@1.8 | 100.0
Geforce 7950 GT (512 MB) E2160@1.8 | 220.5
Geforce 9600 GT OC (1024 MB) E2160@1.8 | 386.4
Geforce 8800 GT OC (512 MB) E2160@1.8 | 421.0
Geforce 8800 GTS OC (512 MB) E2160@1.8 | 428.5
Geforce 9800 GTX (512 MB) E2160@1.8 | 435.3
Performance increase of E2160@2.41 by means of graphics card | Percent
Geforce 6800 GT (256 MB) E2160@2.41 | 100.0
Geforce 7950 GT (512 MB) E2160@2.41 | 235.0
Geforce 9600 GT OC (1024 MB) E2160@2.41 | 446.6
Geforce 8800 GT OC (512 MB) E2160@2.41 | 487.4
Geforce 8800 GTS OC (512 MB) E2160@2.41 | 513.9
Geforce 9800 GTX (512 MB) E2160@2.41 | 519.4
Performance increase of E6750@2.67 by means of graphics card | Percent
Geforce 6800 GT (256 MB) E6750@2.67 | 100.0
Geforce 7950 GT (512 MB) E6750@2.67 | 239.7
Geforce 9600 GT OC (1024 MB) E6750@2.67 | 504.5
Geforce 8800 GT OC (512 MB) E6750@2.67 | 565.8
Geforce 9800 GTX (512 MB) E6750@2.67 | 588.0
Geforce 8800 GTS OC (512 MB) E6750@2.67 | 589.6
Performance increase of Q6600@2.4 by means of graphics card | Percent
Geforce 6800 GT (256 MB) Q6600@2.4 | 100.0
Geforce 7950 GT (512 MB) Q6600@2.4 | 250.2
Geforce 9600 GT OC (1024 MB) Q6600@2.4 | 515.4
Geforce 8800 GT OC (512 MB) Q6600@2.4 | 575.6
Geforce 9800 GTX (512 MB) Q6600@2.4 | 597.6
Geforce 8800 GTS OC (512 MB) Q6600@2.4 | 599.7
Performance increase of Q6600@3.2 by means of graphics card | Percent
Geforce 6800 GT (256 MB) Q6600@3.2 | 100.0
Geforce 7950 GT (512 MB) Q6600@3.2 | 248.6
Geforce 9600 GT OC (1024 MB) Q6600@3.2 | 537.8
Geforce 8800 GT OC (512 MB) Q6600@3.2 | 603.3
Geforce 9800 GTX (512 MB) Q6600@3.2 | 634.3
Geforce 8800 GTS OC (512 MB) Q6600@3.2 | 636.0
Performance increase of X6800EE@2.94 by means of graphics card | Percent
Geforce 6800 GT (256 MB) X6800EE@2.94 | 100.0
Geforce 7950 GT (512 MB) X6800EE@2.94 | 240.7
Geforce 9600 GT OC (1024 MB) X6800EE@2.94 | 511.3
Geforce 8800 GT OC (512 MB) X6800EE@2.94 | 571.4
Geforce 9800 GTX (512 MB) X6800EE@2.94 | 594.1
Geforce 8800 GTS OC (512 MB) X6800EE@2.94 | 599.8

The best result of up to 636% was achieved by the Q6600 overclocked to 3200 MHz in conjunction with the Geforce 8800 GTS 512 OC. The E6750, the Q6600 at its stock clock speed, and the X6800EE produced very similar results, despite prices ranging from $204 to $1,240 (132 to 802 euros). The Geforce 9800 GTX and Geforce 8800 GTS 512 OC also finished very close together: with the weaker E2160 the Geforce 9800 GTX is a little faster, while at higher CPU clock speeds the Geforce 8800 GTS 512 OC benefits from its heavily overclocked shader clock.

  • DjEaZy
    will there be an AMD/ATI roundup???
  • randomizer
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generation differences of cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of an HD3870. There is no need to include ATI cards for the sake of this article.
  • randomizer
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
  • yadge
    I didn't realize the new gpus were actually that powerful. According to Tom's charts, there is no gpu that can give me double the performance over my x1950 pro. But here, the 9600gt was getting 3 times the frames of the 7950gt (which is better than mine) on Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
  • scy
    This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004) and then the basis of these percentage values on page 3 will actually mean something.
  • randomizer
    yadge: "I didn't realize the new gpus were actually that powerful. According to Tom's charts, there is no gpu that can give me double the performance over my x1950 pro. But here, the 9600gt was getting 3 times the frames of the 7950gt (which is better than mine) on Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future."
    I upgraded my X1950 pro to a 9600GT. It was a fantastic upgrade.
  • wh3resmycar
    scy: "This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140)."
    Maybe it is, but it's relevant, especially for people who are stuck with those Prescotts/6800GTs. This article reveals an upgrade path nonetheless.
  • randomizer
    If they had used P4s, there would be so many variables in this article that there would be no direction, and that would make it pointless.
  • JAYDEEJOHN
    Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4's, just take the clock rate and cut it in half, then compare (ok add 10%) heheh
  • justjc
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATi graphics?
    Firstly, we know that ATi and nVidia cards don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to work at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see the minimum required AMD processor is slower than the minimum required Core2, even in processor speed.
    So any chance of a small, or full-scale, article throwing some ATi and AMD power into the mix?