
Comparison of Graphics Chips and Introduction of the Test Configuration

GPU vs. CPU Upgrade: Extensive Tests
The Geforce 6 and 7 are not overclocked. If the Geforce 8800 GTS 512 OC were operated at its standard clock rate, its overall performance would match that of the Geforce 8800 GT OC. The Geforce 9600 GT has 1024 MB of graphics memory, while the new Geforce 9800 GTX competes with the Geforce 8800 GTS 512 for first place. Which of the two comes out on top depends on the clock rates used: both cards are based on the G92 graphics chip, and their technical specifications and clock rates are too similar for any relevant performance differences to emerge.

Nvidia Graphics Cards
Card                 Code Name  Memory         Shader Model, Clock  GPU Clock  Memory Data Rate
Geforce 9800 GTX     G92        512 MB GDDR3   4.0, 1688 MHz        675 MHz    2200 MHz
Geforce 9600 GT OC   G94        1024 MB GDDR3  4.0, 1680 MHz        700 MHz    1900 MHz
Geforce 8800 GTS OC  G92        512 MB GDDR3   4.0, 1825 MHz        730 MHz    1944 MHz
Geforce 8800 GT OC   G92        512 MB GDDR3   4.0, 1650 MHz        660 MHz    1900 MHz
Geforce 7950 GT      G71        512 MB GDDR3   3.0                  550 MHz    1400 MHz
Geforce 6800 GT      NV40       256 MB GDDR3   3.0                  350 MHz    1000 MHz

OC = overclocked (speed is higher than standard)
Memory Data Rate = physical clock rate times two
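The "times two" rule can be checked with a short script. The physical clocks below are simply half the data rates listed in the table above; this is a sketch of the arithmetic, not vendor data.

```python
# DDR ("double data rate") memory transfers data on both clock edges,
# so the effective data rate is twice the physical memory clock.

def effective_data_rate_mhz(physical_clock_mhz: float) -> float:
    """Effective transfer rate of DDR-type memory in MHz."""
    return physical_clock_mhz * 2

# Physical memory clocks (half the data rates from the table above).
cards = {
    "Geforce 9800 GTX": 1100,
    "Geforce 8800 GTS OC": 972,
    "Geforce 7950 GT": 700,
    "Geforce 6800 GT": 500,
}

for card, clock in cards.items():
    print(f"{card}: {effective_data_rate_mhz(clock):.0f} MHz effective")
```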


Test System
Graphics cards  Nvidia (see table above)
CPUs            Intel E2160 @ 1.8 GHz, E2160 @ 2.41 GHz, E6750 @ 2.67 GHz,
                Q6600 @ 2.4 GHz, Q6600 @ 3.2 GHz, X6800 EE @ 2.94 GHz
Cooler          Zalman 9700 LED
Motherboard     Asus P5E3 Deluxe, PCIe 2.0 2x16, ICH9R
Chipset         Intel X38
Memory          2x1 GB Crucial Ballistix, 1.5 V, DDR3-1066, 7-7-7-20 (2x533 MHz)
Audio           Intel High Definition Audio
LAN             Intel PRO/1000
Hard drives     Western Digital WD5000AAKS, 500 GB, SATA, 16 MB cache;
                Hitachi, 120 GB, SATA, 8 MB cache
DVD             Gigabyte GO-D1600C
Power supply    CoolerMaster RS-850-EMBA, 850 W

Driver and Configuration
Graphics driver   Nvidia Forceware 174.53 (9800 GTX: Forceware 174.74)
Operating system  Windows Vista Enterprise
DirectX           10
Chipset driver    Intel X38, 8.3.1.1009

For the purpose of testing, all graphics cards used a version of the new 174 driver series introduced with the 9600 GT. The Geforce 9800 GTX required version 174.74 because its graphics chip was not yet supported by the official releases. Microsoft Flight Simulator X in SP2 and DX10 preview modes still shows rendering errors in the water pixel shader; as a result, the simulated waves are missing. In DX9 mode (Geforce 6 and 7) everything renders correctly.

Comments (thread closed)
  • DjEaZy, May 15, 2008 6:40 AM (score: 2)
    will there be an AMD/ATI roundup???
  • randomizer, May 15, 2008 7:44 AM (score: 5)
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generational differences between cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of an HD 3870. There is no need to include ATI cards for the sake of this article.
  • randomizer, May 15, 2008 7:47 AM (score: 5)
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
  • yadge, May 15, 2008 8:00 AM (score: 0)
    I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600 GT was getting three times the frames of the 7950 GT (which is better than mine) in Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
  • Anonymous, May 15, 2008 8:10 AM (score: 0)
    This article is biased from the beginning by comparing a reference graphics card from 2004 (6800 GT) against a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004), and then the percentage values on page 3 will actually mean something.
  • randomizer, May 15, 2008 8:25 AM (score: 0)
    Quoting yadge: "I didn't realize the new GPUs were actually that powerful. [...] this makes me even more excited for when I upgrade in the near future."

    I upgraded my X1950 pro to a 9600GT. It was a fantastic upgrade.
  • wh3resmycar, May 15, 2008 8:38 AM (score: 0)
    Quoting scy: "This article is biased from the beginning by using a reference graphics card from 2004 (6800GT) to a reference CPU from 2007 (E2140)."

    Maybe it is, but it's relevant, especially for those people who are stuck with those Prescotts/6800 GTs. This article reveals an upgrade path nonetheless.
  • randomizer, May 15, 2008 8:40 AM (score: 1)
    If they had used P4s there would be so many variables in this article that there would be no direction, and that would make it pointless.
  • JAYDEEJOHN, May 15, 2008 9:40 AM (score: 2)
    Great article!!! It clears up many things. It finally proves that the best upgrade a gamer can make is a newer card. About the P4s, just take the clock rate and cut it in half, then compare (ok, add 10%) heheh
  • justjc, May 15, 2008 9:50 AM (score: 0)
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATI graphics?
    First, we know that ATI and Nvidia don't calculate graphics in the same way; who knows, perhaps an ATI card requires more or less processor power to work at full load. And if you look at "Can You Run It?" for Crysis (the only one I recall using), you will see the minimum required AMD processor is slower than the minimum required Core 2, even in processor speed.
    So any chance of a small, or full-scale, article throwing some ATI and AMD power into the mix?
  • randomizer, May 15, 2008 10:05 AM (score: 0)
    In the case of processors, throwing in some AMD chips would be a good idea, as they are often a fair bit slower than a similarly priced C2D. However, I don't think having ATI cards in the mix would show anything really different from what we have now. The higher-end cards will be bottlenecked on a slow processor, while the slower cards won't be bottlenecked as badly. A 3870X2 will need a more powerful CPU to reach its maximum potential, just like a 9800GX2. Of course, the amount of CPU power needed will almost definitely be different, but the overall conclusion is that buying a next-gen (or rather current-gen) card is going to benefit you more than a new CPU unless yours is really old. That is all the article is trying to prove, not which CPU/video card is best.
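[Editor's note] The bottleneck argument in the comment above can be sketched as a toy model: the delivered frame rate is capped by whichever stage is slower. The numbers below are illustrative assumptions, not measurements from the article.

```python
# Toy model of the CPU/GPU bottleneck: a frame must pass through both
# the CPU (game logic, draw calls) and the GPU (rendering), so the
# delivered frame rate is bounded by the slower of the two stages.

def delivered_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Frame rate achievable when each component alone caps at the given fps."""
    return min(cpu_limit_fps, gpu_limit_fps)

# A fast GPU behind a slow CPU gains nothing from the extra GPU power:
slow_cpu = delivered_fps(cpu_limit_fps=40.0, gpu_limit_fps=150.0)
# Upgrading the CPU lets the same GPU stretch its legs:
fast_cpu = delivered_fps(cpu_limit_fps=120.0, gpu_limit_fps=150.0)
print(slow_cpu, fast_cpu)
```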
  • LuxZg, May 15, 2008 11:14 AM (score: 1)
    I've got to say I agree about there being no need to add ATI cards, as they are easily comparable with the selection of Nvidia cards shown in the article.

    But it lacks a comparison with single-core CPUs. In the times of the 6800 GT's popularity we had single-core Athlons like the Barton, and perhaps early Athlons/Semprons on Socket 754. As adding an AGP system would complicate things way too much, I think at least an Athlon 64 on Socket 939 could be used, as those were very popular between/during the 6800 GT and 7xxx series.

    At the very least we should have one set of benchmarks on a low-end/old CPU like that, so we can see if buying a faster card is of any use at all, or whether we hit a performance floor the same way we see the 6800 GT unable to use the additional power of new CPUs.

    Other than this, it's a great article!

    PLEASE - make one more roundup like this once GT200/4800 cards are out!

    P.S. And nice to see that 9800GTX, and overclocked Q6600 are still right on 300W consumption, meaning any quality 400/450W supply is more than enough for them!
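[Editor's note] The PSU claim above is easy to sanity-check. The figures below are the commenter's rough numbers (~300 W system draw, a 450 W supply), used purely as an illustration.

```python
# Rough PSU headroom check: how much of a supply's rating is left
# unused at a given system draw. Illustrative numbers, not measurements.

def headroom_pct(psu_watts: float, draw_watts: float) -> float:
    """Unused PSU capacity as a percentage of its rating."""
    return (psu_watts - draw_watts) / psu_watts * 100

print(f"{headroom_pct(450, 300):.0f}% headroom on a 450 W supply at 300 W draw")
```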
  • Reynod, May 15, 2008 12:06 PM (score: 2)
    Onya ... thanks for pinching the idea from our thread on the subject, buddy!!! You guys must troll our topics looking for good ideas, eh??

    Seriously though ... cheers and thanks !!

    Hey ... update the article with a couple of AMD CPUs thrown into the mix ... perhaps a 3800, 5000, and a 6400, and a couple of 50-series Phenoms.

    Thanks again...
  • cah027, May 15, 2008 1:38 PM (score: 0)
    This is a really cool article. I think it should be updated with every new generation of parts (e.g. new ATI and Nvidia, and Nehalem and Fusion).

    I wonder, if Nehalem is such a big boost, whether it will show gains across the board of GPUs?
  • danatmason, May 15, 2008 1:58 PM (score: 0)
    I've been waiting for an article like this! Great to have figures to back up the idea of a cpu bottleneck. I tried to pair an 8800GT with an Athlon x2 4000+ at stock speed and I was HUGELY disappointed. But with a quick OC of the processor, I'm sitting happy! So I imagine those results scale similarly with AMD processors and the same idea - clock speed matters more for games, not cores - will still hold true.
  • royalcrown, May 15, 2008 2:32 PM (score: 0)
    There's no need to throw a bunch of AMD processors in here, for one reason, well, two...

    1. You can see on the CPU charts where the AMD you have compares with an Intel ... so it would perform about the same with the same card.

    2. CPUs in the same FAMILY on the same architecture will scale the same relative to each other, the way the C2 Duos scale to each other.
  • ubbadubba, May 15, 2008 2:42 PM (score: 0)
    The E8xxx and E7xxx are not mentioned. Please either include them in the next edition (along with a few AMD CPUs) or comment on which parts they would mimic. Like, does the E8400 @ 3 GHz behave like an E2160 @ 3 GHz, or is the E8400 still a little bit better, by X%? The E7200 and E8200 are fairly cheap for new CPU models, so that would be nice for the next go-round.

    A game that's missing is UT3; it benefits favorably from AMD's X3 CPUs as a cost-effective solution, whereas the X3 may not be as good in other games. But maybe that would throw off the overall results if one game did really well and did not follow the same trends as the others.

    And yes, we need AMD CPUs to see the scaling effects across the different GPUs. The interaction of weak CPUs with strong GPUs, or vice versa, is not represented in the CPU charts.

    Speaking of which: when are the GPU charts going to get updated with modern GPUs and games?
  • Anonymous, May 15, 2008 2:48 PM (score: 0)
    What an incredible article! My jaw dropped to the floor at the cost per frame chart. Really nice work! Now, the holy grail is cross-linking article data like this to the CPU and Vid charts. I am very impressed.
  • a 6pack in, May 15, 2008 3:31 PM (score: 0)
    Quoting DjEaZy: "will there be a AMD/ATI roundup???"

    That is something that I am really curious about. I've been bummed out that I killed my quad and went back to dual. I guess it's not really noticeable..

  • hcforde, May 15, 2008 3:34 PM (score: 0)
    SCY (15/05/2008 @ 10:10): most people may have a greater tendency to upgrade their GPU over their CPU. Maybe that is why it was done. We techies may not see it that way, but the general market may.