GPU vs. CPU Upgrade: Extensive Tests

Overclocking the E2160 Processor to 3 GHz

The E2160 has even more potential; pushing it further to 3 GHz should show whether the CPU can compete with the more expensive models. Here, the 3000 MHz clock rate goes up directly against the larger caches of the E6750 and Q6600. For comparison purposes, the faster GeForce 8800 GTS 512 OC was used as the test graphics card.

Important! Overclocking components voids the warranty and raises temperatures. The standard cooler may not be able to cope, so a Zalman 9700 LED was used in this test. When overclocking, always monitor temperatures carefully.

| Crysis v1.2 | 1280x1024, 0xAA, Trilinear, High Quality | 1680x1050, 0xAA, Trilinear, High Quality | 1920x1200, 0xAA, Trilinear, High Quality | 1280x1024, 0xAA, Trilinear, Very High Quality | Total value in fps | Percent |
|---|---|---|---|---|---|---|
| 8800 GTS OC (512 MB), E2160@1.8 | 23.0 | 20.5 | 19.8 | 17.3 | 80.6 | 100.0 |
| 8800 GTS OC (512 MB), E2160@2.4 | 29.8 | 26.2 | 24.2 | 20.9 | 101.1 | 125.4 |
| 8800 GTS OC (512 MB), E2160@3.0 | 34.8 | 30.1 | 25.8 | 22.5 | 113.2 | 140.4 |
| 8800 GTS OC (512 MB), E6750@2.67 | 34.7 | 28.8 | 24.6 | 24.5 | 112.6 | 139.7 |
| 8800 GTS OC (512 MB), Q6600@3.2 | 36.6 | 30.2 | 24.8 | 24.4 | 116.0 | 143.9 |
| 8800 GTS OC (512 MB), X6800EE@2.94 | 39.0 | 31.4 | 26.0 | 22.7 | 119.1 | 147.8 |

In Crysis, the E2160 at 3 GHz catches up with the E6750 at its standard 2.67 GHz clock rate. The jump from 2400 to 3000 MHz adds another 15 percentage points. Measured against the E2160's standard 1800 MHz, that is 40% more overall performance that the graphics card can put on screen. The gaps to the Q6600 at 3200 MHz and the X6800EE are 4% and 8%, respectively; Crysis appears to favor the expensive Extreme Edition.
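The "Total value in fps" and "Percent" columns are simple derived figures: the total is the sum of the four test runs, and the percent value expresses that total relative to the stock-clocked E2160 baseline. A minimal sketch, using the Crysis figures from the table above:

```python
# Derive the "Total value in fps" and "Percent" columns from per-run fps.
# Figures are the Crysis results from the table above.
results = {
    "E2160@1.8": [23.0, 20.5, 19.8, 17.3],  # stock clock = 100% baseline
    "E2160@3.0": [34.8, 30.1, 25.8, 22.5],  # overclocked to 3 GHz
}

baseline_total = sum(results["E2160@1.8"])  # 80.6 fps

for cpu, fps in results.items():
    total = sum(fps)
    percent = total / baseline_total * 100
    print(f"{cpu}: total {total:.1f} fps, {percent:.1f}%")
```

Running this reproduces the 113.2 fps and 140.4% figures quoted for the 3 GHz configuration.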

| World in Conflict 1.05 | 1280x1024, 0xAA, Trilinear, Very High Quality | 1680x1050, 0xAA, Trilinear, Very High Quality | 1920x1200, 0xAA, Trilinear, Very High Quality | Total value in fps | Percent |
|---|---|---|---|---|---|
| 8800 GTS OC (512 MB), E2160@1.8 | 18.0 | 21.0 | 20.0 | 59.0 | 100.0 |
| 8800 GTS OC (512 MB), E2160@2.4 | 31.0 | 31.0 | 30.0 | 92.0 | 155.9 |
| 8800 GTS OC (512 MB), E2160@3.0 | 40.0 | 38.0 | 37.0 | 115.0 | 194.9 |
| 8800 GTS OC (512 MB), E6750@2.67 | 42.0 | 41.0 | 38.0 | 121.0 | 205.1 |
| 8800 GTS OC (512 MB), Q6600@3.2 | 47.0 | 44.0 | 39.0 | 130.0 | 220.3 |
| 8800 GTS OC (512 MB), X6800EE@2.94 | 43.0 | 41.0 | 37.0 | 121.0 | 205.1 |

World in Conflict responds to raw clock rate when no antialiasing is used: overclocking from 1800 to 3000 MHz lets the E2160 almost double its overall performance. The difference to the more expensive CPU models is small; the Q6600 overclocked to 3200 MHz delivers the best results.

| World in Conflict 1.05 | 1280x1024, 4xAA, 4xAF, Very High Quality | 1680x1050, 4xAA, 4xAF, Very High Quality | 1920x1200, 4xAA, 4xAF, Very High Quality | Total value in fps | Percent |
|---|---|---|---|---|---|
| 8800 GTS OC (512 MB), E2160@1.8 | 19.0 | 18.0 | 18.0 | 55.0 | 100.0 |
| 8800 GTS OC (512 MB), E2160@2.4 | 28.0 | 27.0 | 22.0 | 77.0 | 140.0 |
| 8800 GTS OC (512 MB), E2160@3.0 | 33.0 | 28.0 | 25.0 | 86.0 | 156.4 |
| 8800 GTS OC (512 MB), E6750@2.67 | 34.0 | 28.0 | 24.0 | 86.0 | 156.4 |
| 8800 GTS OC (512 MB), Q6600@3.2 | 36.0 | 30.0 | 23.0 | 89.0 | 161.8 |
| 8800 GTS OC (512 MB), X6800EE@2.94 | 34.0 | 29.0 | 25.0 | 88.0 | 160.0 |

With antialiasing enabled, 2400 MHz is essentially the ceiling; the additional gain up to 3000 MHz is smaller, as the graphics card becomes the limiting factor. The gaps between the CPUs shrink: the GeForce 8800 GTS 512 OC paired with the E2160 at 3 GHz comes close to the level of an E6750, a processor costing almost three times as much.
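One way to see the GPU limit in the numbers is to compare the fps gain against the clock gain for the 1.8 to 3.0 GHz step. A hypothetical quick check using the World in Conflict totals from the two tables above:

```python
# Compare CPU clock scaling with measured fps scaling (WiC totals above).
clock_ratio = 3.0 / 1.8        # ~1.67x higher clock

wic_no_aa = 115.0 / 59.0       # 0xAA totals: E2160@3.0 vs. E2160@1.8
wic_4x_aa = 86.0 / 55.0        # 4xAA totals: same two configurations

print(f"clock: {clock_ratio:.2f}x, no AA: {wic_no_aa:.2f}x, 4xAA: {wic_4x_aa:.2f}x")
# Without AA the fps gain keeps pace with the clock increase; with 4xAA
# it falls clearly behind it, because the graphics card is the bottleneck.
```

Without antialiasing the performance scales roughly 1.95x for a 1.67x clock increase, while with 4xAA it manages only about 1.56x, confirming the shift from a CPU limit to a GPU limit.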

  • DjEaZy
    Will there be an AMD/ATI roundup???
    Reply
  • randomizer
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generation differences of cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of a HD3870. There is no need to include ATI cards for the sake of this article.
    Reply
  • randomizer
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
    Reply
  • yadge
    I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600GT was getting 3 times the frames of the 7950GT (which is better than mine) on Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
    Reply
  • scy
    This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004) and then the basis of these percentage values on page 3 will actually mean something.
    Reply
  • randomizer
    yadge said: I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600GT was getting 3 times the frames of the 7950GT (which is better than mine) on Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.

    I upgraded my X1950 Pro to a 9600GT. It was a fantastic upgrade.
    Reply
  • wh3resmycar
    scy said: This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140).

    Maybe it is, but it's relevant, especially for those people who are stuck with those Prescotts/6800GTs. This article reveals an upgrade path nonetheless.
    Reply
  • randomizer
    If they had used P4s there would be so many variables in this article that there would be no direction, and that would make it pointless.
    Reply
  • JAYDEEJOHN
    Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4's, just take the clock rate and cut it in half, then compare (ok add 10%) heheh
    Reply
  • justjc
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATi graphics?
    Firstly, we know that ATi and nVidia graphics don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to work at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see the minimum required AMD processor is slower than the minimum required Core2, even in processor speed.
    So any chance of a small, or full scale, article throwing some ATi and AMD power into the mix?
    Reply