Microsoft Flight Simulator X SP2

GPU vs. CPU Upgrade: Extensive Tests

The Geforce 6800 GT and 7950 GT run with DirectX 9 effects only. In this mode, the environment is not reflected in the water, but the waves are simulated cleanly by the pixel shader. In DirectX 10 mode, the landscape is reflected in the surface of the water; with Forceware driver version 174.74, however, the pixel shader still does not produce waves.
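
To make the difference between the two render paths concrete, here is a minimal C++ sketch of the decision logic described above. The struct and function names are purely illustrative assumptions, not part of FSX or the Forceware driver.

```cpp
#include <iostream>
#include <string>

// Illustrative only: models the water-rendering behavior described above,
// not actual FSX or driver code.
struct WaterFeatures {
    bool shaderWaves;    // waves computed by the pixel shader
    bool envReflection;  // landscape reflected in the water surface
};

WaterFeatures selectWaterPath(bool dx10Mode, const std::string& forcewareVersion) {
    WaterFeatures f{};
    if (!dx10Mode) {
        // DirectX 9 path (6800 GT, 7950 GT): clean shader waves, no reflection.
        f.shaderWaves   = true;
        f.envReflection = false;
    } else {
        // DirectX 10 path: the landscape is reflected in the water...
        f.envReflection = true;
        // ...but with Forceware 174.74 the pixel shader still produces no waves.
        f.shaderWaves   = (forcewareVersion != "174.74");
    }
    return f;
}

int main() {
    std::cout << std::boolalpha;
    WaterFeatures dx9  = selectWaterPath(false, "174.74");
    WaterFeatures dx10 = selectWaterPath(true,  "174.74");
    std::cout << "DX9:  waves=" << dx9.shaderWaves  << " reflection=" << dx9.envReflection  << '\n';
    std::cout << "DX10: waves=" << dx10.shaderWaves << " reflection=" << dx10.envReflection << '\n';
}
```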

[Chart: GPU vs. CPU benchmark results]

The flight simulator responds strongly to the clock speed of the CPU: the Q6600 at 2.4 GHz is slower than the E6750 at 2.67 GHz. Only when the Q6600 is overclocked to 3200 MHz does it take the top spot with the new G92 graphics chip. The Geforce 6800 GT barely reacts at all to a faster processor, and the Geforce 7950 GT reacts only slightly. The Geforce 9600 GT with its 1024 MB of graphics memory achieves the best results, but only in conjunction with the E6750, Q6600 or X6800EE, and at the 1920x1200 resolution with antialiasing. If the weaker E2160 is installed, the Geforce 9800 GTX with 512 MB of graphics memory wins the race.
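
The pattern behind these numbers is the usual bottleneck rule: the delivered frame rate is capped by whichever of CPU and GPU is slower. The following C++ sketch, with made-up example numbers rather than measurements from the charts, shows why an old card like the 6800 GT barely reacts to a faster processor while a G92 card rewards overclocking:

```cpp
#include <algorithm>
#include <cstdio>

// Toy bottleneck model with illustrative numbers (not values from the charts):
// the frame rate you see is the minimum of what the CPU and the GPU can feed.
double fps(double cpuFpsAt2667MHz, double cpuMHz, double gpuFps) {
    // Assume roughly linear scaling with CPU clock, as the FSX results suggest.
    double cpuFps = cpuFpsAt2667MHz * (cpuMHz / 2667.0);
    return std::min(cpuFps, gpuFps);
}

int main() {
    // Slow CPU + fast GPU: the CPU is the cap, so the GPU sits idle.
    std::printf("1800 MHz CPU + fast GPU: %.0f fps\n", fps(30.0, 1800.0, 60.0));
    // Overclocking the CPU to 3200 MHz raises the cap until the GPU limits again.
    std::printf("3200 MHz CPU + fast GPU: %.0f fps\n", fps(30.0, 3200.0, 60.0));
    // Fast CPU + 6800 GT-class GPU: the GPU is the cap, so CPU speed barely matters.
    std::printf("3200 MHz CPU + old GPU:  %.0f fps\n", fps(30.0, 3200.0, 18.0));
}
```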

[Charts: GPU vs. CPU benchmark results]

Comments (75; thread closed)
  • [+2] DjEaZy, May 15, 2008 6:40 AM
    will there be an AMD/ATI roundup???
  • [+5] randomizer, May 15, 2008 7:44 AM
    That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you see the generational differences between cards and the performance gains compared to getting a new processor. You will see the same thing with ATI cards. Pop in an X800 and watch it crumble in the wake of an HD3870. There is no need to include ATI cards for the sake of this article.
  • [+5] randomizer, May 15, 2008 7:47 AM
    This has been a long-needed article IMO. Now we can post links instead of coming up with simple explanations :D
  • [0] yadge, May 15, 2008 8:00 AM
    I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance of my X1950 Pro. But here, the 9600GT was getting 3 times the frames of the 7950GT (which is better than mine) in Call of Duty 4.

    Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
  • [0] Anonymous, May 15, 2008 8:10 AM
    This article is biased from the beginning by comparing a reference graphics card from 2004 (6800GT) against a reference CPU from 2007 (E2140).

    Go back and use a Pentium 4 Prescott (2004), and then the percentage values on page 3 will actually mean something.
  • [0] randomizer, May 15, 2008 8:25 AM
    yadge: I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance of my X1950 Pro. But here, the 9600GT was getting 3 times the frames of the 7950GT (which is better than mine) in Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.

    I upgraded my X1950 Pro to a 9600GT. It was a fantastic upgrade.
  • [0] wh3resmycar, May 15, 2008 8:38 AM
    scy: This article is biased from the beginning by comparing a reference graphics card from 2004 (6800GT) against a reference CPU from 2007 (E2140).

    Maybe it is, but it's relevant, especially for those people who are stuck with those Prescotts/6800GTs. This article reveals an upgrade path nonetheless.
  • [+1] randomizer, May 15, 2008 8:40 AM
    If they had used P4s, there would be so many variables in this article that there would be no direction, and that would make it pointless.
  • [+2] JAYDEEJOHN, May 15, 2008 9:40 AM
    Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4's, just take the clock rate and cut it in half, then compare (ok, add 10%) heheh
  • [0] justjc, May 15, 2008 9:50 AM
    I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATi graphics?
    Firstly, we know that ATi and nVidia graphics cards don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to work at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see that the minimum needed AMD processor is slower than the minimum needed Core2, even in raw processor speed.
    So any chance of a small, or full scale, article throwing some ATi and AMD power into the mix?
  • [0] randomizer, May 15, 2008 10:05 AM
    In the case of processors, throwing in some AMD chips would be a good idea, as they are often a fair bit slower than a similarly priced C2D. However, I don't think having ATI cards in the mix would show anything really different from what we have now. The higher-end cards will be bottlenecked on a slow processor, while the slower cards won't be bottlenecked as badly. A 3870X2 will need a more powerful CPU to reach its maximum potential, just like a 9800GX2. Of course, the amount of CPU power needed will almost definitely be different, but the overall conclusion is that buying a next-gen (or current-gen, rather) card is going to benefit you more than a new CPU unless yours is really old. That is all the article is trying to prove, not which CPU/video card is best.
  • [+1] LuxZg, May 15, 2008 11:14 AM
    I've got to say I agree about there being no need to add ATI cards, as they are easily comparable with the selection of nVidia cards shown in the article.

    But it lacks a comparison with single-core CPUs. In the days of the 6800GT's popularity we had single-core Athlons like the Barton, and perhaps early Athlons/Semprons on socket 754. As adding an AGP system would complicate things way too much, I think at least a socket 939 Athlon64 could be used, as those were very popular during the 6800GT and 7xxx series era.

    At the very least we should have one set of benchmarks on a low/old CPU like that, so we can see if buying a faster card is of any use at all, or whether we hit a performance floor the same way we see the 6800GT unable to use the additional power of new CPUs.

    Other than this, it's a great article!

    PLEASE - make one more roundup like this once the GT200/4800 cards are out!

    P.S. And nice to see that the 9800GTX and overclocked Q6600 are still right at 300W consumption, meaning any quality 400/450W supply is more than enough for them!
  • [+2] Reynod, May 15, 2008 12:06 PM
    Onya ... thanks for pinching the idea from our thread on the subject, buddy!!! You guys must troll our topics looking for good ideas, eh??

    Seriously though ... cheers and thanks !!

    Hey ... update the article with a couple of AMD CPUs thrown into the mix ... perhaps a 3800, a 5000, and a 6400, and a couple of 50-series Phenoms.

    Thanks again...
  • [0] cah027, May 15, 2008 1:38 PM
    This is a really cool article. I think it should be updated with every new generation of parts (ex: new ATi and Nvidia, and Nehalem and Fusion).

    I wonder, if Nehalem delivers such a big boost, whether it will show gains across the whole range of GPUs?
  • [0] danatmason, May 15, 2008 1:58 PM
    I've been waiting for an article like this! Great to have figures to back up the idea of a CPU bottleneck. I tried to pair an 8800GT with an Athlon X2 4000+ at stock speed and I was HUGELY disappointed. But with a quick OC of the processor, I'm sitting happy! So I imagine those results scale similarly with AMD processors, and the same idea - clock speed matters more for games, not cores - will still hold true.
  • [0] royalcrown, May 15, 2008 2:32 PM
    There's no need to throw a bunch of AMD processors in here for one reason, well two...

    1. You can see on the CPU charts how the AMD you have compares with an Intel... so it would perform about the same with the same card.

    2. CPUs in the same FAMILY on the same architecture will scale among themselves the same way the C2 Duos scale relative to each other.
  • [0] ubbadubba, May 15, 2008 2:42 PM
    The E8xxx and E7xxx are not mentioned. Please either include them in the next edition (along with a few AMD CPUs) or comment on which parts they would mimic. Like, does the E8400 @ 3GHz behave like an E2160 @ 3GHz, or is the E8400 still a little bit better, by some X%? The E7200 and E8200 are fairly cheap for new CPU models, so that would be nice for the next go-round.

    A game that's missing is UT3 -- it benefits notably from AMD's X3 CPUs as a cost-effective solution, whereas the X3 may not be as good in other games. But maybe that would throw off the overall results if one game did really well and did not follow the same trends as the other games.

    And yes, we need AMD CPUs to see the scaling effects across the different GPUs. The interaction of weak CPUs with strong GPUs, or vice versa, is not represented in the CPU charts.

    Speaking of which: when are the GPU charts going to get updated with modern GPUs and games?
  • [0] Anonymous, May 15, 2008 2:48 PM
    What an incredible article! My jaw dropped to the floor at the cost per frame chart. Really nice work! Now, the holy grail is cross-linking article data like this to the CPU and Vid charts. I am very impressed.
  • [0] a 6pack in, May 15, 2008 3:31 PM
    DjEaZy: will there be an AMD/ATI roundup???

    That is something that I am really curious about. I've been bummed out that I killed my quad and went back to dual. I guess it's not really noticeable...

  • [0] hcforde, May 15, 2008 3:34 PM
    SCY (15/05/2008 @ 10:10): most people may have a greater tendency to upgrade their GPU over their CPU. Maybe that is why it was done. Us techies may not see it that way, but the general market may.