I've been looking for information about how the CPU affects the performance of different video cards, specifically as it pertains to gaming.
My current gaming system is a P4 2.0GHz with a 400MHz FSB and a GeForce4 Ti4600, which I have had for about two years. I was thinking about buying a completely new system, something like a P4 3.2 with an 800MHz FSB and the latest video card (FX5950 or Radeon 9800).
What I am curious about is how much CPU power is required to drive the latest video cards. Obviously, I could just buy a faster video card and get more FPS or better anti-aliasing, but I don't know how much my current CPU would hold a new card back.
Hopefully I am making sense. If anyone has links to articles that talk about this sort of thing, please share them. Or, if Tom *hint* would like to run a few tests to show the effect of the CPU on video performance, that would be great.
I realize there are other factors: AGP speed, how much AI a game uses, etc. But anything that shows some sort of comparison would really help.
The graphics forum would be a better place to post this question; there are a few guys there who could really help. One of the things they would tell you is: don't bother unless you are unhappy with the way games are playing now. If you want more eye candy, go with a 9800XT, and make sure it's from a retailer who is giving vouchers for HL2. Your current CPU is not too much of a bottleneck, and there is no option to get a better chip for your mobo (a faster P4a chip would be about the same). If you have tons of cash, sure, it's always a good time to upgrade. Remember, however, that both Nvidia and ATI have taped out new GPUs, so we can expect new cards in a couple of months or less.
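If it helps to see why the CPU only matters up to a point, here is a toy model I knocked up (just a sketch, every number in it is invented): a frame cannot finish faster than the slower of the CPU work and the GPU work, so while the GPU is the bottleneck, a faster CPU barely moves the FPS.

    # Toy model: every frame needs some CPU work (game logic) and some GPU
    # work (rendering); the frame rate is capped by whichever takes longer.
    # All the millisecond figures below are made up for illustration.

    def fps(cpu_ms, gpu_ms):
        """Frames per second when the slower stage sets the pace."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    slow_cpu, fast_cpu = 12.0, 7.0   # ms of CPU work per frame
    old_gpu, new_gpu = 20.0, 8.0     # ms of GPU work per frame

    print(fps(slow_cpu, old_gpu))    # 50 fps  - GPU-bound, CPU speed hardly matters
    print(fps(fast_cpu, old_gpu))    # 50 fps  - faster CPU, same FPS
    print(fps(slow_cpu, new_gpu))    # ~83 fps - new card, now the CPU is the limit
    print(fps(fast_cpu, new_gpu))    # 125 fps - only now the faster CPU pays off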
Can't believe it? Listen to Endyen. If you don't believe him, do a test yourself with a friend who has a faster or slower CPU.
Download the free version of 3DMark2001 (or a later version) and run the tests yourself. Build a basic, unloaded Windows partition: borrow a spare hard drive, or dual-boot your drive with a second partition, to do this. Then swap the card from one machine to the other.
You will find that the benchmarking tool shows the CPU contributes only a fraction of the score. I'll give you a real-life example: roughly doubling the CPU, going from a Celeron 1400 to an AMD 2200, increased the score with the same card from 3000 to 3500. That is less than a 20% increase in score for about a 100% increase in CPU power. I'm not sure exactly how that translates to FPS or other real-world performance, but it is a rough measure of how much difference you would see in games.
Switching the graphics card to an ATI 9000Pro upped the score from 3500 to 7500, because that change also added support for graphics features that didn't even run on the older card (like environment bump mapping).
In short, it's mostly the graphics card that makes the difference.
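For the record, here is the arithmetic on those two jumps, using only the scores already quoted above:

    # Percent gains from the 3DMark scores mentioned in this post.
    def pct_gain(before, after):
        return 100.0 * (after - before) / before

    print(pct_gain(3000, 3500))  # ~16.7% from roughly doubling the CPU
    print(pct_gain(3500, 7500))  # ~114.3% from swapping in the ATI 9000Pro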
The loving are the daring!
Yeah, the first video card I bought was a GeForce MX440 or something like that, but when I would play a game or a movie I would be using 100% of my CPU and the video would blip from frame to frame. Then I got an FX 5200 (I know, horrible, but it's the best they had in my little town), and now when I watch DVDs or AVIs I'm only using about 25% of my CPU. So I'd say it's definitely all in the video card.