GPU vs. CPU Upgrade: Extensive Tests

Conclusions: Changing the Graphics Card Generation Has More Benefits

Geforce 6 and 7 cards are hardly suitable for current games and modern LCD resolutions; the higher graphics settings in the test show that the older graphics chips really have reached their limits. This becomes obvious, at the latest, when you pair a Geforce 6800 GT with a more powerful CPU and see no palpable increase in 3D performance.

However, games are not 100% dependent on the graphics card: the Geforce 8 and 9 require a basic level of CPU power, otherwise they are unable to exploit their 3D potential. The CPU clock should lie somewhere between 2600 and 3000 MHz; any lower, and the new graphics chips lose considerable performance.

There is no obvious advantage to quad cores over dual cores, at least in the graphics-based benchmarks; for the Q6600 to compete with the dual-core E6750, it has to run at the same clock rate. If you wish to combine an E2160 with a Geforce 8800 or Geforce 9, you will need to overclock: below a clock rate of roughly 2400 MHz, you lose a considerable amount of graphics performance because the card is never fully loaded.

The difference in performance among CPUs costing $77, $268, or even $1,237 (50, 170, 800 Euros) is actually relatively small. If you compare an E2160 at 1800 MHz to an E6750 or Q6600, you will find a 30% difference in the overall results. If the E2160 is overclocked to 2400 MHz, though, the difference shrinks to just 15%. The smaller cache of the budget E2160 can be compensated for by raising its clock rate, up to around 3 GHz.

The change to a new generation of graphics card achieves more, but the CPU should still have sufficient brawn to provide the basic level the card requires. Changing from a Geforce 6800 GT to a current Geforce 8800 or 9800 can quintuple the overall results in 3D games. Changing from a Geforce 7950 GT to one of the new G92 graphics chips will at least double the overall results.

If you convert the frame rates to percentages in order to filter out the weighting caused by high-fps titles, changing from a Geforce 6800 GT to a current Geforce 8800 or 9800 yields an increase of over 1100%. Changing from a Geforce 7950 GT to one of the new G92 graphics chips yields a performance increase of up to 180%, along with improved DirectX 10 effects. The maximum attainable values depend on the CPU's performance level.
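To make this normalization concrete, here is a minimal sketch in Python; the game names and fps figures are hypothetical placeholders, not measurements from this test:

    # Minimal sketch: average percentage gains instead of raw frame rates.
    # All fps values below are hypothetical placeholders, not test results.
    baseline = {"Game A": 18.0, "Game B": 9.0, "Game C": 45.0}    # old card, e.g. a 6800 GT
    upgraded = {"Game A": 95.0, "Game B": 35.0, "Game C": 160.0}  # new card, e.g. an 8800/9800

    # Summing raw fps lets high-fps titles dominate the overall result.
    raw_ratio = sum(upgraded.values()) / sum(baseline.values())

    # Expressing each game relative to its own baseline (100%) weights
    # every title equally before averaging.
    per_game = [upgraded[g] / baseline[g] * 100 for g in baseline]
    normalized = sum(per_game) / len(per_game)

    print(f"raw fps ratio: {raw_ratio:.2f}x")
    print(f"normalized result: {normalized:.0f}% of baseline (+{normalized - 100:.0f}%)")

The point of the percentage method is that a single fast-running title cannot inflate the average, which is why the normalized figures can differ so strongly from a raw fps comparison.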

  • DjEaZy
    will there be an AMD/ATI roundup???
    Reply
  • randomizer
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generation differences of cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of a HD3870. There is no need to include ATI cards for the sake of this article.
    Reply
  • randomizer
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
    Reply
  • yadge
    I didn't realize the new gpus were actually that powerful. According to Tom's charts, there is no gpu that can give me double the performance over my x1950 pro. But here, the 9600gt was getting 3 times the frames as the 7950gt (which is better than mine) on Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
    Reply
  • scy
    This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004) and then the basis of these percentage values on page 3 will actually mean something.
    Reply
  • randomizer
    yadge: I didn't realize the new gpus were actually that powerful. According to Tom's charts, there is no gpu that can give me double the performance over my x1950 pro. But here, the 9600gt was getting 3 times the frames as the 7950gt (which is better than mine) on Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
    I upgraded my X1950 pro to a 9600GT. It was a fantastic upgrade.
    Reply
  • wh3resmycar
    scy: This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140).
    maybe it is, but it's relevant, especially for people who are stuck with those Prescotts/6800GTs. this article reveals an upgrade path nonetheless
    Reply
  • randomizer
    If they had used P4s, there would be so many variables in this article that there would be no direction, and that would make it pointless.
    Reply
  • JAYDEEJOHN
    Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4s, just take the clock rate and cut it in half, then compare (ok, add 10%) heheh
    Reply
  • justjc
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATi graphics?
    Firstly, we know that ATi and nVidia graphics cards don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to work at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see the minimum required AMD processor is slower than the minimum required Core2, even in processor speed.
    So any chance of a small, or full-scale, article throwing some ATi and AMD power into the mix?
    Reply