Editor’s Note: Prices in this article are presented mostly in Euros. The purpose of this article is to show relative pricing and relative price/performance ratios, not highly accurate current pricing. Therefore, we feel that Euro-based pricing is acceptable here.
This is an endless topic of conversation, and everybody you meet has their own pet opinion: which brings better results, buying a faster graphics card or investing your cash in a more powerful processor? To find out, Tom’s Hardware has taken a close look at the most important chips. In this article, the GeForce 6800 GT, 7950 GT, 8800 GT, 8800 GTS 512, 9600 GT 1024 MB and 9800 GTX are cross-tested for performance, each combined with current CPUs such as the E2160, E6750, Q6600 and X6800EE.
The results should clarify how much performance is gained by upgrading the various components, which combinations work best, and which are cheapest. They will also show how high the processor’s base performance needs to be for the new G92 graphics chips from Nvidia to develop their full 3D speed potential. The comprehensive tables and performance analyses give clear results detailing the effect of a CPU upgrade on the GeForce 6 and 7, and whether it might simply be better to move to the new DirectX 10 graphics card generation.
The test platform is based on an X38 chipset with DDR3 memory and a PCI Express 2.0 interface, and remains identical for all of the individual tests; only the graphics card and CPU change between runs. To ensure that the E2160 and Q6600 are able to keep up with the other processors, two additional test runs were performed in which the Front Side Bus (FSB) was overclocked by 33-34%. The results show whether simply overclocking the small-cache budget CPU can compensate for its initial performance deficit, and how much extra performance the quad core can pull from its reserves at the increased frequencies.
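Since the resulting core clock is simply the FSB frequency multiplied by the CPU’s multiplier, the effect of a 33-34% FSB overclock is easy to work out. The sketch below illustrates the arithmetic; the multipliers and stock FSB values are the published specifications for these two chips, while the exact overclocked FSB settings are assumptions chosen to match the stated 33-34% increase, not figures taken from the test logs.

```python
# Quick sanity check of what a 33-34% FSB overclock does to the core clock.
# Core clock = FSB frequency x multiplier. The multipliers and stock FSBs
# below are the published specs for the E2160 and Q6600; the overclocked
# FSB values are assumptions for illustration only.

def core_clock(fsb_mhz: float, multiplier: int) -> float:
    """Effective core clock in GHz for a given FSB and multiplier."""
    return fsb_mhz * multiplier / 1000

cpus = {
    # name: (stock FSB in MHz, multiplier, assumed overclocked FSB in MHz)
    "E2160": (200, 9, 266),   # 1.8 GHz stock; +33% FSB -> ~2.4 GHz
    "Q6600": (266, 9, 356),   # 2.4 GHz stock; +34% FSB -> ~3.2 GHz
}

for name, (stock_fsb, mult, oc_fsb) in cpus.items():
    stock = core_clock(stock_fsb, mult)
    oc = core_clock(oc_fsb, mult)
    gain = (oc_fsb / stock_fsb - 1) * 100
    print(f"{name}: {stock:.1f} GHz -> {oc:.2f} GHz (FSB +{gain:.0f}%)")
```

Run as-is, this prints the E2160 going from 1.8 GHz to roughly 2.4 GHz and the Q6600 from 2.4 GHz to roughly 3.2 GHz, which is where a 33-34% FSB bump would land these parts.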
- Test Subjects: Four Generations of Nvidia Chips
- Comparison of Graphics Chips and Introduction of the Test Configuration
- Graphics Cards have More Potential
- CPU Power for the Graphics Cards
- 3D Performance for the CPU
- Benchmarks: BlackSite: Area 51 v1.2
- Call of Duty 4 v1.4
- Crysis v1.2
- Half Life 2 Episode 2
- Microsoft Flight Simulator X SP2
- Prey v1.4
- World in Conflict v1.05
- 3DMark06 1280x1024p v1.1.0
- What Advantages does Overclocking the CPU have for the Graphics Card?
- Overclocking the E2160 Processor to 3 GHz
- Overall Performance and Price Comparison
- Power Consumption, Noise Levels and Temperatures
- Overall Energy Consumption and Energy Saving
- 3D Performance Sorted According to Resolution and Anti-Aliasing
- Conclusions: Changing the Generation of Graphics Card has More Benefits

Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
Go back and use a Pentium 4 Prescott (2004) as the baseline, and then the percentage values on page 3 will actually mean something.
I upgraded my X1950 pro to a 9600GT. It was a fantastic upgrade.
Maybe it is, but it's relevant, especially for those people who are stuck with those Prescotts/6800 GTs. This article reveals an upgrade path nonetheless.
Firstly, we know that ATi and Nvidia don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to run at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see that the minimum required AMD processor is slower than the minimum required Core 2, even in raw processor speed.
So any chance of a small, or full scale, article throwing some ATi and AMD power into the mix?
But it lacks a comparison with single-core CPUs. In the days of the 6800 GT's popularity we had single-core Athlons like the Barton, and perhaps early Athlon 64s/Semprons on Socket 754. As adding an AGP system would complicate things way too much, I think at least a Socket 939 Athlon 64 could be used, as those were very popular between/during the 6800 GT and 7xxx series.
At the very least we should have one set of benchmarks on a slow/old CPU like that, so we can see whether buying a faster card is of any use at all, or whether we hit a performance floor the same way we see the 6800 GT unable to use the additional power from new CPUs.
Other than this, it's a great article!
PLEASE - make one more roundup like this once the GT200/4800 cards are out!
P.S. And it's nice to see that the 9800 GTX and an overclocked Q6600 still come in right at 300 W total consumption, meaning any quality 400/450 W supply is more than enough for them!
Seriously though ... cheers and thanks !!
Hey ... update the article with a couple of AMD CPUs thrown into the mix ... perhaps a 3800+, a 5000+ and a 6400+, and a couple of 50-series Phenoms.
Thanks again...
I wonder whether Nehalem will be a big enough boost to show gains across the whole board of GPUs?
1. You can see on the CPU charts where the AMD CPU you have falls compared with an Intel one... so it would perform about the same with the same card.
2. CPUs in the same FAMILY on the same architecture will scale between themselves the same way the Core 2 Duos scale relative to each other.
A game that's missing is UT3 -- it benefits noticeably from AMD's X3 CPUs as a cost-effective solution, whereas the X3 may not be as good in other games. But maybe that would throw off the overall results if one game did really well and didn't follow the same trends as the other games.
And yes, we need AMD CPUs to see the scaling effects across the different GPUs. The interaction of weak CPUs with strong GPUs, or vice versa, is not represented in the CPU charts.
Speaking of which: when are the GPU charts going to get updated with modern GPUs and games?
That is something I am really curious about. I've been bummed out that I killed my quad and am back to a dual core. I guess it's not really noticeable...