GeForce 8800 Needs The Fastest CPU

November 29, 2006 10:06:40 AM

We've shown you how fast Nvidia's newest ponies can gallop, but what happens when you upgrade to the fastest platform available? Do you really need the fastest CPU for the fastest GPU?
November 29, 2006 10:33:23 AM

Thanks for writing this up. I am looking around to upgrade my vid card for Xmas and was thinking strongly about the 8800GTX. With my rig, though, I might just have to settle for X1950XTXs in CrossFire. I really wanted to jump into DX10 territory, though.
November 29, 2006 11:04:38 AM

Oh well, I guess I'll just get an E6600 and overclock it once I upgrade to one of these DX10 cards.
November 29, 2006 11:05:20 AM

With a Core 2 Duo E6600 you would still get just as much or more power than CrossFired X1950XTXs. I'm going for an 8800GTS with the E6600 and 2GB of DDR2 RAM in about a fortnight, with my new build.
And if you find it being bottlenecked too much, you can overclock the processor. Most E6600s will clock to 3.00GHz (333MHz x 9 = 3.0GHz) easily, which is what I plan to do.
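
For anyone checking that math: on these chips the core clock is just FSB x multiplier, and the E6600's multiplier is locked at 9x. A throwaway script (mine, not from the article; stock FSB from memory) shows why 333MHz lands on 3.0GHz:

Code:
# Core clock = FSB x multiplier. The E6600 ships at 2.40GHz (266.67MHz x 9)
# with the multiplier locked at 9x, so overclocking means raising the FSB.
multiplier = 9
for fsb_mhz in (266.67, 300, 333):
    print(f"{fsb_mhz}MHz x {multiplier} = {fsb_mhz * multiplier / 1000:.2f}GHz")
# 266.67MHz x 9 = 2.40GHz (stock); 333MHz x 9 = 3.00GHz (the overclock above)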
November 29, 2006 11:12:40 AM

I wondered why THG stuck with the AMD-powered system when it's fairly obvious to just about everyone with a brain that a current Conroe-based Intel platform offers more potential. Once I saw the review of the GF8800 cards on [H] I knew for certain that those cards were more powerful than what Tom's had shown originally.
November 29, 2006 11:30:40 AM

Well, looks like I'm set. I have an E6600 overclocked to 3.0GHz. I just want some DX10 games now; there's nothing out to truly test the 8800's capabilities.
November 29, 2006 11:41:40 AM

Only if you are going to game at 1024x768 or 1280x1024 all your life... but looking at the higher resolutions, it is not imperative to have the best of the best CPU for these GPUs.

And to be honest, the FX-60 isn't exactly a budget CPU at the moment either; perhaps a comparison between an E6300 and an X6800 would have been better. Or if you wanted an AMD chip as the budget CPU, try one of the least expensive ones like the AMD64 X2 3800+ AM2, which retails around £108. But another £3 would get you an E6300, so budget/performance-wise I would go for the E6300.
November 29, 2006 11:48:59 AM

This article makes me sad

*glares at his AGP 9800 Pro*
November 29, 2006 11:55:40 AM

I'd like to see the impact on Oblivion of using a quad core, as it benefits from multiple cores. It should get higher framerates. :) 
November 29, 2006 11:58:22 AM

You don't need the fastest C2D. The E6600 will give you numbers very close to the X6800 and save you $600.
November 29, 2006 12:03:34 PM

I will just sit back with my San Diego and X800 XT PE and enjoy CSS while the prices go up and down and up and down and up and down and down and down and down, then a new major upgrade catastrophe happens

and then I will buy the DX10 R699 XTXXXTXTX, which by that time will be on sale at Best Buy for $149.99. Then I'll go down the street, grab 2GB of DDR2, a quad core Intel processor, and a top-of-the-line mobo with a 1kW PSU while everyone else is buying their DDR3 Socket 666 motherboards and 5kW PSUs

then I'll go home and get every game you turkeys have been playing the whole time... yes... it will be a splendid time indeed

:twisted:
November 29, 2006 12:05:03 PM

I've got no problem with this article, but the title is more than a bit misleading. Saying "GeForce 8800 Needs The Fastest CPU" tends to suggest that there is no point to the 8800 unless you have the Core 2 Extreme X6800. This statement is clearly false.

For the most demanding game on the market at the moment (Oblivion outdoors), at all but 2560x1600 the game runs above 30 fps using the AMD chip (you don't state whether this is average or minimum).

The X1950XTX only achieves this at a paltry 1024x768 in comparison. Put another way, the 8800 GTX copes with no visible change in performance at all but the highest res using the old CPU. This does not suggest that it needs the fastest CPU. Quite the opposite, in fact.

Equally, your article doesn't even mention potential gains from using the Conroe chips or the Allendales. And, again, no mention of the potential for overclocking these chips to get the most from your 8800GTX.

And it is this last point that is the useful thing to come out of the article. When a game is released that is even more demanding than Oblivion outdoors, or when games are eventually engineered to use multiple cores (let alone quad), then the 8800 series will be in a league of its own with the quad processor; it has plenty left in the tank. But for now, whether it does 35, 55, or 155 is completely irrelevant... it performs fast enough for gaming.

So, your summary
Quote:
If you don't do the job properly, the net effect will be like hooking up a pair of garbage speakers to a Bose or Klipsch sound system. The effect would be the same... less than optimal performance, and an experience that is far from ideal given the money you spent
is just plain wrong. A more correct summary would be: in order to get the best fps score on the most demanding game of the moment, you need the fastest CPU. But ultimately this will not impact your gaming experience at all but the highest resolutions.

Unless you're looking to run at a monster resolution, the 8800 GTX will destroy all games on the market with most of the settings maxed out. And in two years, when game developers have caught up with the technology, maybe then you'll need the fastest CPU. But until then, you most certainly don't; your results tell us that.
November 29, 2006 12:11:53 PM

Question: What did you have to do to get the 9700 cooler onto the eVGA m/b? I am hearing that one has to dremel the bracket that goes on the back of the board.

Michael
November 29, 2006 12:14:52 PM

I agree with what you said here. If a person is trying to achieve the record in a bench, then the fastest CPU is required. Gaming is gaming, and even with a lower-end CPU and the 8800 cards there will still be an increase in fps no matter what Tom's writeup may say. That is how all upgrades have been, and that won't change the way the 8800 works with any CPU.
November 29, 2006 12:18:27 PM

Quote:
I agree with what you said here. If a person is trying to achieve the record in a bench, then the fastest CPU is required. Gaming is gaming, and even with a lower-end CPU and the 8800 cards there will still be an increase in fps no matter what Tom's writeup may say. That is how all upgrades have been, and that won't change the way the 8800 works with any CPU.


Isn't the fastest CPU the QX6700 anyway, and not the X6800? 0,o
November 29, 2006 12:23:34 PM

I have no idea what the fastest is anymore. I stopped looking at the fastest and started looking for the cheapest that achieves something close to the fastest; that's how I do it now.
November 29, 2006 12:23:46 PM

I hope my FX-60... oops, 4400+ OC'ed to 2.6GHz, is fast enough.
November 29, 2006 12:28:01 PM

Sure it is. I believe what is said in Tom's is a selling tool for Nvidia and Intel. I don't believe Nvidia made this card to only work with a $1000 processor, and wouldn't you think Nvidia would say you need a 6600 CPU to run it?
November 29, 2006 12:29:13 PM

As enticing as the 8800 GTS numbers are, I think I'll sit this round out. Okay, not completely out, but I will give my AGP system a bit longer to live.

Upgrading my system to a GTS would cost me a minimum of $600, and that's if I keep my 3800+ X2 and my DDR400. I would also have to reinstall my OS and apps. How much does that cost?

On the other hand, I can swap out my 6800GT for an X1950 AGP for half that and get by for another year or so. By that time we will probably have a better idea of what PCIe 2.0 is going to be like, quad cores will be cheaper, DX10 drivers will be more mature, and Vista SP1 should be out.

No, I think I'll wait.
November 29, 2006 12:30:51 PM

Yeah, I will go to a DX10 card, but probably not the top card.

Maybe an 8600GT or X2000GT or whatever the midrange cards are going to be called.
November 29, 2006 12:32:23 PM

Quote:
I would also have to reinstall my OS and apps. How much does that cost?


I have found it costs more time than money.
But aren't they the same :lol: 

Uh, gotta go.
Got customers dropping off their cars, and I'd better get to work.
I'm here alone today and have exhaust jobs, no-starts, cooling system checks,
and a bunch of other stuff.

Busier than a hooker in a room full of Kennedys :lol: 
November 29, 2006 12:39:24 PM

Would have been nice to see the X6800 overclocked just to tip the results even more.

Quote:
Sure it is. I believe what is said in Tom's is a selling tool for Nvidia and Intel. I don't believe Nvidia made this card to only work with a $1000 processor


This is the top-of-the-range card. Why buy this and skimp on the rest of your system?
Also, you can buy the E6300 and overclock it past the speed of an X6800. That's a $180 processor.
November 29, 2006 12:44:11 PM

Quote:
This article makes me sad

*glares at his AGP 9800 Pro*


Been there, had that.

Upgraded recently to an OC'd E6600 and a 7950 GTX... noticeable improvement overall.


Oh, BTW:

Quote:
the net effect will be like hooking up a pair of garbage speakers to a Bose or Klipsch sound system


Bose sux. ;) 
November 29, 2006 12:45:57 PM

Quote:
We've shown you how fast Nvidia's newest ponies can gallop, but what happens when you upgrade to the fastest platform available? Do you really need the fastest CPU for the fastest GPU?


I think everyone is missing the point of the original poster. The current fastest GPU available is the Nvidia 8800 GTX; his question is, if you get this GPU, regardless of what you intend to do (i.e., wait for R600, or R1000 for that matter, or a 9900... :p ), do you need the best CPU currently on the market?

In my opinion, I do not believe you do. The only reason to get this card is the ability to game at insane resolutions, 1900x1200 and beyond; if you game below this with the 8800 GTX, then you are doing the card an injustice. At these resolutions the games become GPU bound rather than CPU bound.

And as we can see from the article itself, the results do indeed show that the frame rates are within the same realm at resolutions above 1900x1200.

The only other reason to get the best of the best with this card is bragging rights... but I shall not get into that, as this subject is already starting to irk me somewhat.
November 29, 2006 12:48:34 PM

That's weird. I wasn't aware that there was an X1950 AGP card out 8O

I also disagree with the 8800 needing the "fastest CPU". I mean, WHO THE H3LL GAMES AT 2560 x 1600???

Anyone who has a Conroe or Allendale and clocks it to 3.0GHz will be just fine, I think. I wish THG would do a test like that! :D 
November 29, 2006 12:58:13 PM

Quote:
That's weird. I wasn't aware that there was an X1950 AGP card out 8O

I also disagree with the 8800 needing the "fastest CPU". I mean, WHO THE H3LL GAMES AT 2560 x 1600???

Anyone who has a Conroe or Allendale and clocks it to 3.0GHz will be just fine, I think. I wish THG would do a test like that! :D 


Well, some reflection from the readers helps in an article like this.
It implies that using the top-end CPU you will get the best results.
You don't have to have it, but then again, if you are spending stupid amounts of money on this enthusiast card, you should be allowing a small budget for a good CPU.

The Core 2 CPU has more of an advantage at lower resolutions. So if you are a frame rate junkie, this is the CPU for you.

Also, I agree with your last point. Anyone with a brain, though, will know an E6300 can OC to X6800 speeds.

Thus we can deduce that the best processor for this card is any Core 2 Duo!
Personally, the E6600 would probably be the best choice to go with this card.
In fact, by buying the E6600 over the X6800 you will save enough money to buy this card!
November 29, 2006 1:02:03 PM

It'd be nice if you labeled the y-axis with percentages on your graphs. Also, there are a few minor typos in your article.
November 29, 2006 1:13:39 PM

Quote:

It implies that using the top-end CPU you will get the best results.
You don't have to have it, but then again, if you are spending stupid amounts of money on this enthusiast card, you should be allowing a small budget for a good CPU.

Also, I agree with your last point. Anyone with a brain, though, will know an E6300 can OC to X6800 speeds.

Thus we can deduce that the best processor for this card is any Core 2 Duo!
Personally, the E6600 would probably be the best choice to go with this card.
In fact, by buying the E6600 over the X6800 you will save enough money to buy this card!


You are spot on. Most enthusiasts will be able to look behind the attention-grabbing headlines and false conclusions; the test results speak for themselves in that regard.

Unfortunately, a great many people dip into this site without fully understanding the technology, read an article like that, and then assume it's gospel (without really understanding it). For them, THG can be more damaging than helpful.

I am completing a build for a casual gamer (never played Oblivion) who is convinced that the CPU is as important as the graphics card. They don't multitask, video edit, or do anything processor intensive. But they want a future-proofed PC without spending stupid money. This type of article just reinforces their CPU misconceptions.
November 29, 2006 1:21:06 PM

Mr Polkowski, I am sorry, but I find your article to be the worst I have yet read on any serious hardware site. Sorry about my bad grammar; English is not my native language. My main concerns are:

Nr 1. You never say where your test points and scores came from. You could have just made them up or used any settings unknown to us readers. You have to write down what you have tested in a test, otherwise the test is totally useless. Solution = write down your settings.

Nr 2. In your CPU difference table you write "60% higher" at 1280. 60% of what, and compared to what? AMD versus Intel? What exactly have you tested? You write "average fps" in the text below, but at high or low graphics? In the article it's more like medium (4x-8x) and low (no AA/AF).

And this is before you start adding better graphics from the modding scene. Many games look twice as good with mods, but the GPU takes a beating (Oblivion is the best example).

"GeForce 8800 Needs The Fastest CPU"... this is just not true, if I may say so. Why does it need it?

Is it if you are one of the, eh, few (or none) who buy a $600-$1000 GPU to run their stuff at low quality settings?

With AF and AA set at 16x/16x and everything set to max quality in the card's drivers and in game, the CPU in fact makes almost no difference in the tests made so far on the net and at home. The games are still GPU bound at high quality settings. The performance of a $90 CPU is the same as a $900 one when the settings go high. You can test this yourself, and everyone else can do it at home.
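
A toy model shows why (the millisecond numbers below are invented, purely to illustrate the GPU-bound case): the CPU and GPU each spend time on every frame, roughly in parallel, so whichever takes longer sets the frame rate.

Code:
# Toy bottleneck model with made-up numbers: fps = 1000 / max(cpu_ms, gpu_ms),
# i.e. the slower stage sets the pace.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low res, low quality: the GPU finishes quickly, so the CPU is the limit
# and a 2x faster CPU doubles the frame rate.
print(fps(cpu_ms=8, gpu_ms=4))   # 125 fps
print(fps(cpu_ms=4, gpu_ms=4))   # 250 fps

# High res with 16xAA/16xAF: say the GPU needs 25 ms per frame regardless.
print(fps(cpu_ms=8, gpu_ms=25))  # 40 fps
print(fps(cpu_ms=4, gpu_ms=25))  # still 40 fps; the faster CPU buys nothing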

To make up for this article, I want you to write a new one:
GeForce 8800 Needs Only a Modest $99 CPU to Max Out Image Quality in Games.

If you readers buying 8800 cards read this, ask yourself: do I want 160 fps at good quality with a $900 CPU, or 90 fps at the best quality with a $99 CPU? I would go for the latter.

PS: The new CPUs are still monsters when it comes to hard work like video editing, encoding, and 3D. But as a gamer you don't need them... and with DX10 you might end up needing them even less for gaming.

The 8800 GTX cannot run Oblivion with max candy at even 1024x800 with 16xAF/16xAA and beauty-mod add-ons like 4096 textures and complete scene renderings. Even FEAR will start to struggle above 1280. Why does everyone use 4x-8x max when there is 16x/16x max plus high quality in the drivers? Soft shadows make FEAR GPU bound. Why no soft shadows? It looks great. I have yet to see a new game that runs better on a 4GHz Core 2 than on my 2.7GHz AMD64 3500+. Please prove me wrong if you can.

Sorry if I sounded rude, but reviewers have to open their eyes and see the larger picture: what is useful for the customer. Many other reviewers (in fact, all of them, it seems) make the same mistake. You can be the first to make a difference.
November 29, 2006 1:53:26 PM

I agree with most of what you said. Though a new C2D isn't required for good fps, it IS required for the future. The 8800 is the FIRST DX10 card out; it's only going to get better. The R600 may show an even larger need for a better CPU, as may any new game that appears soon. For right now? It's true you needn't get a new CPU for your eye candy at low res... but to future proof...?
November 29, 2006 2:03:24 PM

No one knows if they will even be part of the future.
DX10 will most likely make games less CPU bound, not more. Oblivion, for example, could on DX10 have its nature generated on the fly within the GPU instead of with calculations made outside the card (as long as the scenery doesn't have advanced physics; Oblivion's doesn't). It would be like how in some games there is an endless number of levels made up on the fly (Gauntlet, etc.): the same thing, but prettier.
November 29, 2006 2:04:30 PM

Wow... the fastest CPU will allow your video card to run faster...
Is this news? :p 
What is somewhat informative, though, is the discrepancy between ATI and Nvidia with regard to their CPU utilization.
This really can go either way; it's either a good thing or a bad thing depending on the system.
My feeling has been, and always will be, that dollar for dollar you're better off buying a faster video card for an older CPU than a newer CPU for an older video card if you really want to push game performance/frame rates. Which, I suspect, is much what everyone else does. In some cases (BF2 comes to mind) RAM itself will make an enormous difference.
Again, that's not to say "cheap out on the CPU"; what I am saying is that I'd rather pay $150 for a CPU (and yeah, I'll probably OC it) and $400 for a video card than split the money evenly, or skew it the other way.
Especially if it means having to upgrade an entire platform... 939 to AM2 / Intel C2D, for example.
Of course, my motivation is as much value as possible, trying to keep as much money in my pocket to actually buy the software I bought all this crap for.

If you're rich... if you can afford it... go for it all :p 
November 29, 2006 2:06:43 PM

Cheap solution:

6300 OC'd to 3.0 FTW!
November 29, 2006 2:08:57 PM

Quote:
It's true you needn't get a new CPU for your eye candy at low res... but to future proof...?


Future proofing to me means spending money now that could save you money in the future (i.e., by not upgrading). But when you're talking about a $400-$500 increase in CPU cost, I'm fairly confident that by getting a cheaper CPU now (one that will do the job perfectly well for gaming), with the natural fall in prices, you will save money in the long run by upgrading in 2-3 years.
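
To put rough numbers on that (all prices invented for the sake of the argument, not taken from the article):

Code:
# Toy cost comparison with invented prices.
# Option A: buy the extreme chip today and keep it for years.
extreme_today = 1000

# Option B: buy a mid-range chip today, upgrade in 2-3 years after prices fall.
midrange_today = 300
assumed_upgrade_later = 350  # guess at a future chip faster than today's extreme

print(f"Option A: ${extreme_today}")                           # $1000
print(f"Option B: ${midrange_today + assumed_upgrade_later}")  # $650, newer silicon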

On the other hand, the 8800 DX10 GPU, which has a significant, tangible impact on games that are with us already (so you'll see the benefits instantly), seems like a sound piece of future proofing.

For me, the Extreme X6800 chip is for the 0.1% of gamers running at a monster resolution, or for someone who is running processor-intensive desktop apps.
November 29, 2006 2:19:13 PM

I've been doing 3D animation for TV since the first Video Toaster came out, in 1990. I had two of them.

I can't tell any meaningful difference between the Nvidia DX10 and ATI DX9 graphics. Putting the graphics through an oscilloscope also shows no noticeable differences.
November 29, 2006 2:27:04 PM

Quote:
It's true you needn't get a new CPU for your eye candy at low res... but to future proof...?


Future proofing to me means spending money now that could save you money in the future (i.e., by not upgrading). But when you're talking about a $400-$500 increase in CPU cost, I'm fairly confident that by getting a cheaper CPU now (one that will do the job perfectly well for gaming), with the natural fall in prices, you will save money in the long run by upgrading in 2-3 years.

On the other hand, the 8800 DX10 GPU, which has a significant, tangible impact on games that are with us already (so you'll see the benefits instantly), seems like a sound piece of future proofing.

For me, the Extreme X6800 chip is for the 0.1% of gamers running at a monster resolution, or for someone who is running processor-intensive desktop apps.

Lol, I wish I could be so succinct.
You managed to say exactly what I've been wanting to say in about 50% of the space I take to say it :p 
November 29, 2006 2:38:57 PM

Of course not. The difference lies in how the graphics are produced. Would you see a difference watching a movie at 2 frames/sec versus 24 frames/sec? You would. Therefore 2 fps in DX9 would not be OK if DX10 optimization gives 24 fps.

Solution = make the DX9 graphics worse and get 24 fps.

But most people will have DX9, so games will be made for that for years to come. DX10 will only be "DX10 light" for many years yet. But I hope DX10 will at least be used to showcase candy. Greater uses, like on-the-fly environments, will probably need DX10-only games... not likely to happen soon.
November 29, 2006 2:47:41 PM

Quote:
I've been doing 3D animation for TV since the first Video Toaster came out, in 1990. I had two of them.

I can't tell any meaningful difference between the Nvidia DX10 and ATI DX9 graphics. Putting the graphics through an oscilloscope also shows no noticeable differences.


eh.... why would you look at your graphics through an o-scope?? Moreover, I'm sure there'd be differences between ATi and nVidia signals, but they'd be damned near impossible to decipher on the scope.....
November 29, 2006 2:51:42 PM

Playing BFME2 I noticed that x64 runs slower than x32, i.e., a P4 faster than an AMD X2. Tested using the fire rate of archer units... anyway, a friend tested Vista x64: no fire rate issues.

My question is this... would it be possible for Tom's Hardware to install an x64 OS to test the full power of the AMD cores? I understand x32 should be "nearly as good as native"; however, I'm not convinced of this. Any thoughts about when/if you'll ever do such a review?

Thank you.
November 29, 2006 2:54:51 PM

omg. being bottlenecked by the processor? so it seems as if we do need CPU horsepower? god, I've only been preaching this for quite some time now :roll: :roll: :roll: :roll:
November 29, 2006 3:00:28 PM

Quote:
I've been doing 3D animation for TV since the first Video Toaster came out, in 1990. I had two of them.

I can't tell any meaningful difference between the Nvidia DX10 and ATI DX9 graphics. Putting the graphics through an oscilloscope also shows no noticeable differences.


Well, I know nothing about the 3D animation process (although I'm jealous... sounds like you get to be a kid for life :lol:  )

What I would say, though, is that it depends entirely on what you are 'doing' in 3D animation. If you are getting a program to render objects (as per a computer game), you will definitely get an improvement in fps moving from a DX9 card to the 8800. This may be completely irrelevant if both fps are above 30 (ish), because the human eye won't be able to detect any improvement.

If in terms of 3D animation you are talking about compiling an end product (say, into a movie), then that is entirely CPU/RAM dependent.

And if you're talking about watching 3D animation in a movie format, most new GPUs can do that comfortably (obviously depending on resolution).
November 29, 2006 3:06:31 PM

GeForce 8800 Needs The Fastest CPU = wrong and unfair to readers and potential buyers.

GeForce 8800 Loves The Fastest CPU = fair statement

maia
November 29, 2006 3:07:21 PM

Quote:
omg. being bottlenecked by the processor? so it seems as if we do need CPU horsepower? god, I've only been preaching this for quite some time now :roll: :roll: :roll: :roll:


You are wrong, and have been for quite some time it seems, unless you run things at lower quality settings.
November 29, 2006 3:08:30 PM

Yup...

At least I am happy they did an article explaining why the hell they used an FX-60 =).

FEAR was pretty impressive to me!

Looks like a C2D E6600 @ 3.6 with a GTS will do the trick 8)

Now if you want lots of bungholio marks you need a quad core!
November 29, 2006 3:14:31 PM

Quote:
That's weird. I wasn't aware that there was an X1950 AGP card out 8O

I also disagree with the 8800 needing the "fastest CPU". I mean, WHO THE H3LL GAMES AT 2560 x 1600???


A Tom's article even noted that most gamers still use 1024x768. So yeah, why showcase 2560x1600 if not just for the uber geek factor? :?
November 29, 2006 3:25:58 PM

Quote:
omg. being bottlenecked by the processor? so it seems as if we do need CPU horsepower? god, I've only been preaching this for quite some time now :roll: :roll: :roll: :roll:


You are wrong, and have been for quite some time it seems, unless you run things at lower quality settings.


Examine that article again. Pay close attention to the 1600x1200+ resolutions for FEAR, Doom 3, and Oblivion (basically, games that tax the system). The difference between 26fps and 39fps in Oblivion is the difference between playable and not playable. Higher resolutions still require a decent CPU.

I'm not saying the 8800GTX requires a QC CPU; however, look at the differences between the lower-end AMD and Intel CPUs compared to the better ones in their class. Quite the performance increase, I might say.

So, in the words of Elmer Fudd... shaddup, rabbit.
November 29, 2006 3:47:30 PM

The title of this article is very misleading.

Of course, the faster the CPU, the better your frame rate. And everybody knows that an 8800 will give better results with a faster CPU.

However, it doesn't need the fastest CPU to see tangible and very real performance gains. Anybody with a faster AM2 or Core 2 system will see an increase in performance with an 8800-series card. Let's be honest... if a graphics card NEEDS the CPU that will allow it to use its full potential, then there isn't a CPU on the market good enough for the 8800-series cards. For the past several releases, GPUs have been ahead of CPUs.

I'm still stuck on an AM2 system with a 4200+ OC'd to FX-62 speeds. I stepped up from a 7900 card to an 8800GTS and saw a large performance boost.

Now I could scrap my mobo/CPU and go with a Core 2 system... but I don't NEED to.
November 29, 2006 4:03:29 PM

Quote:

However, it doesn't need the fastest CPU to see tangible and very real performance gains.


Technically there is no such thing as "need"; however, having a well-balanced system is essential, IMO.


Quote:

Let's be honest... if a graphics card NEEDS the CPU that will allow it to use its full potential, then there isn't a CPU on the market good enough for the 8800-series cards. For the past several releases, GPUs have been ahead of CPUs.


About damn time somebody finally understands this.


Edit: my new thoughts, thinking over the thread title. It fits perfectly.
Why? Simple: in order to see the performance that should come out of your system with a GF8800, you do need some of the latest and greatest hardware... however, most of us can't afford that, so we have to settle for middle-of-the-road stuff; and it seems to suit us just fine.
November 29, 2006 4:09:24 PM

Quote:
omg. being bottlenecked by the processor? so it seems as if we do need CPU horsepower? god, I've only been preaching this for quite some time now :roll: :roll: :roll: :roll:


You are wrong, and have been for quite some time it seems, unless you run things at lower quality settings.


Examine that article again. Pay close attention to the 1600x1200+ resolutions for FEAR, Doom 3, and Oblivion (basically, games that tax the system). The difference between 26fps and 39fps in Oblivion is the difference between playable and not playable. Higher resolutions still require a decent CPU.

I'm not saying the 8800GTX requires a QC CPU; however, look at the differences between the lower-end AMD and Intel CPUs compared to the better ones in their class. Quite the performance increase, I might say.

So, in the words of Elmer Fudd... shaddup, rabbit.

And with that you just proved how wrong you are. There was very little difference in the review at 1600x1200, and it states that above this the CPU makes little difference.

But the review uses 4xAA/8xAF, a low quality setting. If you use 16x/16x and set the other quality settings higher, guess what?
Even at 1280 there is no difference between a $50 AMD64 CPU and a $999 quad core.

Get it? Please try to think about it. If you use soft shadows in FEAR and 16x/16x, you get the same fps on a five-year-old Athlon XP as on a quad core.
Get it? Try to think again. GPU limitation! If the GPU can't handle it, the CPU doesn't matter at all. Get it?

Please list all the games you know of that are modern and CPU limited.
November 29, 2006 4:18:08 PM

Dude, no CPU is future proof. Look at the CPU changes in the past year: in a few short months, quad core. And I just know AMD has something that is going to blow us away, I just know it. The air has been very hush-hush with AMD!