
FAQ: 8800 GTX/GTS CPU Bottlenecking Answers

Last response: in Graphics & Displays
December 14, 2006 7:48:11 AM

I'm authoring this post because I'm about to go mad. This question has been asked more times in the past 3 weeks than all the ugly men in the world asking, "did you cum?"

ANSWER:

Getting an 8800 GTX or GTS will noticeably increase performance regardless of what crap CPU you have. Of course, the faster the CPU, the better the performance. Duh. As many have said, ANY CPU will bottleneck the 8800 GTX, but this is no different from any other video card on the market.

So long as you have a PCI-E x16 slot and DDR400, you're going to be fine. But use common sense. If you're going to drop $650 on a GTX for a Pentium 4 3.0 GHz, you might want to rethink your investment.
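To make the "everything bottlenecks" point concrete, here's a minimal sketch of why a faster GPU still helps even when the CPU is the slower stage. All the millisecond figures are made up for illustration, not measurements:

```python
# Toy model: per frame, the CPU and GPU stages overlap, so the slower
# one sets the pace. The millisecond numbers below are invented.

def fps(cpu_ms, gpu_ms):
    """Approximate frame rate when the slower stage limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Slow CPU + old GPU: GPU-bound.
old_card = fps(cpu_ms=20.0, gpu_ms=33.0)

# Same slow CPU + 8800-class GPU: now CPU-bound, but still faster overall.
new_card = fps(cpu_ms=20.0, gpu_ms=8.0)

print(f"old card: {old_card:.0f} fps, new card: {new_card:.0f} fps")
```

The CPU "bottlenecks" the new card in the second case (it caps the frame rate at 50 fps), yet the upgrade still gained roughly 20 fps, which is the whole point of the answer above.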

And someone PLEASE sticky this.
December 14, 2006 11:04:21 AM

Quote:
ANY CPU will bottleneck the 8800 GTX, but this is no different from any other video card on the market.


So what's not going to bottleneck the 8800 GTX? The equivalent of BlueGene stuffed inside a single piece of silicon?
December 14, 2006 11:50:45 AM

An Intel Core 2 Extreme QX6700 overclocked to 4 GHz

That should do the trick.


In other words, as mentioned before, no current CPU is powerful enough to extract the full potential of these new GPUs.
December 14, 2006 9:46:04 PM

I don't worry anymore about my OC'ed E6700.

I'll upgrade it in a few years, I think. 8)
December 14, 2006 9:51:50 PM

Hopefully this will get through some thick skulls.
December 16, 2006 6:07:14 AM

Actually, there is a bottleneck with CPUs. I thought the same as you guys, that no matter what CPU you get or have, it won't kill performance that much... WRONG WRONG WRONG!

I got an 8800 GTS, E6400, 680i, and 1 GB of 800 MHz RAM for my next-gen build, but before I put the build together I decided to see how my 8800 GTS got along with my single-core 630 at 3.0 GHz. Terrible... I got 7690 in 3DMark05. It went up a whole 680 points from my original score of 7010 with my 7900 GT.

Games ran just as bad. I tried CoH and got so-so results with everything maxed. So I put the new build together, and an 8800 GTS paired with an E6400 kills, I mean kills. I got 11005 in 3DMark05 and 7027 in 3DMark06. I OC'ed my E6400 to 3.2 GHz easily with my 680i, and guess what!... My 3DMark05 score jumped to 14630. 3625 points just by OCing the processor; I didn't even OC the 8800 GTS. I got 8160 in 3DMark06. As for games...

THEY RUN AWESOME at 8x AA with everything maxed out. MTW2 I can now play at huge settings with everything on: bloom, reflections, unit detail at highest, 8x AA, 16x AF. I leave shadows off, and I get 60 FPS, dipping at the lowest to 30 FPS but never any lower.

THE CPU DOES MATTER. THG DID NOT LIE.
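For what it's worth, the percentage gains implied by the 3DMark05 scores in that post work out like this (using only the numbers the poster gave):

```python
# Percentage gains computed from the 3DMark05 scores quoted above.
runs = {
    "7900 GT -> 8800 GTS (same P4 630)": (7010, 7690),
    "E6400 stock -> E6400 @ 3.2 GHz (same 8800 GTS)": (11005, 14630),
}
for label, (before, after) in runs.items():
    gain = 100.0 * (after - before) / before
    print(f"{label}: +{gain:.0f}%")
```

Roughly +10% from the GPU swap on the old CPU versus +33% from the CPU overclock alone, which is what makes the post's numbers so striking.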
December 16, 2006 4:10:32 PM

So if the CPU matters so much,

is an X6900 and 7600GS better than a 3500+ and 8800GTS in games?
December 16, 2006 6:13:49 PM

Quote:
So if the CPU matters so much,

is an X6900 and 7600GS better than a 3500+ and 8800GTS in games?


I was aware we were talking about 8800s and bottlenecking. An X6900 + 8800 is ten times better than a 3500+ + 8800.
December 16, 2006 6:42:08 PM

I think mature drivers are going to bring the greatest performance improvements with the Geforce 8800GTX.
December 16, 2006 6:46:39 PM

Quote:
I'm authoring this post because I'm about to go mad. This question has been asked more times in the past 3 weeks than all the ugly men in the world asking, "did you cum?"

ANSWER:

Getting an 8800 GTX or GTS will noticeably increase performance regardless of what crap CPU you have. Of course, the faster the CPU, the better the performance. Duh. As many have said, ANY CPU will bottleneck the 8800 GTX, but this is no different from any other video card on the market.

So long as you have a PCI-E x16 slot and DDR400, you're going to be fine. But use common sense. If you're going to drop $650 on a GTX for a Pentium 4 3.0 GHz, you might want to rethink your investment.

And someone PLEASE sticky this.


>> As many have said, ANY CPU will bottleneck the 8800 GTX,

Nope. Looking at CPU usage while playing Oblivion clearly shows that my Core 2 Extreme is not a bottleneck for my 8800 GTX. I am playing at 1920x1200 with all the graphics settings maxed out. Of course, when I get my other 8800 GTX back from RMA/repair and go SLI, it might be a different matter...
December 16, 2006 8:29:50 PM

Quote:
So if the CPU matters so much,

is an X6900 and 7600GS better than a 3500+ and 8800GTS in games?


I was aware we were talking about 8800s and bottlenecking. An X6900 + 8800 is ten times better than a 3500+ + 8800.
Please, go ahead and prove it.

And also, you were talking about a synthetic benchmark, which doesn't tell crap about gaming performance.
December 16, 2006 9:17:34 PM

You know what else bottlenecks the GPU?

Nothing bottlenecks the G80. The G80 is God. 95 fps in CoH at max settings at 1920x1200 on stock speeds, here I come! If you're willing to spend $600 on a GPU but not $200 on a CPU, that's your problem, not mine.

Why do you people have to come up with these debates for everything?
December 17, 2006 1:56:27 AM

Dude... you are WAY off base. The thing you either don't know or are forgetting is that the CPU rendering test weighs heavily into 3DMark's score. In real life, CPUs don't do nearly that much work.

So that stupid score you're quoting doesn't mean shite. OF COURSE your score is going to go up with a new CPU. I challenge you to review the SM2.0 and HDR/SM3.0 tests and see how much of a difference there is between a P4 630 and your beloved E6400. I guarantee it'll be less than 10%.
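The mechanics of that complaint can be sketched with a made-up weighted score. The weights and subscores below are invented purely for illustration and are NOT 3DMark's actual formula:

```python
# Hypothetical weighted total: if a CPU subtest carries real weight,
# a faster CPU inflates the overall number even when the graphics
# subtests barely move. Invented weights, not 3DMark's real formula.

def overall(gpu_score, cpu_score, cpu_weight=0.3):
    return (1.0 - cpu_weight) * gpu_score + cpu_weight * cpu_score

slow_cpu_total = overall(gpu_score=5000, cpu_score=1000)
fast_cpu_total = overall(gpu_score=5100, cpu_score=3000)  # GPU tests up ~2%

print(slow_cpu_total, fast_cpu_total)  # the overall number jumps ~18% anyway
```

A ~2% move in the graphics subtests becomes a ~18% move in the total, which is why the post insists on reading the SM2.0 and HDR/SM3.0 subtests rather than the headline score.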

And besides, like Prozac said, 3DMark is a mostly synthetic benchmark and should never be used to represent a card's entire (or even partial) performance. If it were such a great benchmarking suite, we wouldn't need to benchmark games like Oblivion, Doom 3, FEAR, etc.
December 17, 2006 2:54:54 AM

Hahaha, quite a war seems to be getting started here. It's like chucking rocks at a hornet's nest. Why is it so hard to understand? Nvidia should work to make their drivers a little less CPU-intensive, but hey, nobody's perfect.
December 17, 2006 3:48:35 AM

My monitor runs at 60 Hz, so 60 frames per second is all I need from a video card. It's a Dell 2405 with a 1920x1200 16:10 native resolution, though, so I need a card that can run that resolution at 60 fps with everything on. Any stats above 60 fps and 1920x1200 I ignore!

I defy anyone to tell the difference between 60 fps and 65 fps, or even 150 fps for that matter, in any first-person shooter, because ultimately you only need those high frame rates for games like FEAR.
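The 60 Hz argument comes down to a frame-time budget; a quick check of the arithmetic:

```python
# At a fixed refresh rate, each frame has a fixed time budget; with vsync
# on, frames rendered faster than the budget just wait for the display.
refresh_hz = 60
budget_ms = 1000.0 / refresh_hz
print(f"{budget_ms:.1f} ms per frame at {refresh_hz} Hz")
```

Roughly 16.7 ms per frame: any GPU headroom beyond that at 1920x1200 is invisible on a 60 Hz panel, which is the poster's point.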

Unfortunately, Tom's and everyone else don't do benchmarks in widescreen resolutions, so I have to make an estimate between 1600x1200 and the next 4:3 resolution.

Hopefully my 4600 won't bottleneck a GTS or GTX too much?
December 17, 2006 10:57:55 AM

Quote:
And besides, like Prozac said, 3DMark is a mostly synthetic benchmark and should never be used to represent a card's entire (or even partial) performance. If it were such a great benchmarking suite, we wouldn't need to benchmark games like Oblivion, Doom 3, FEAR, etc.

Yea, 3DMark measures the size of the e-penis. :roll:
December 17, 2006 7:24:49 PM

I agree completely. But the dude I was replying to was using 3DMark scores as "all-inclusive" proof that CPUs make a huge difference in gaming.
December 18, 2006 9:29:40 PM

I'm ordering as we speak: a quad Extreme 2.66, G80 GTX, and 4 GB of 800 MHz RAM (XP64), and I'm using a Dell 24-inch widescreen.

So, will my GTX be 'bottlenecking' ?
December 18, 2006 10:32:27 PM

Could someone tell me why my CPU usage is only 80-95% in Oblivion? I have an X1950 Pro and only a single-core 3800+ at 2.7 GHz... this doesn't make sense to me. I'm also a little confused as to why my 3000+ X800 XL system would run Halo at 60% but this system runs it at 100%... it did the same thing with Quake 4, but in reverse.