
Would an Athlon XP 3800+ 64 bottleneck an 8800GT?

Last response: in Graphics & Displays
November 13, 2007 12:46:10 AM

My friend has an Athlon XP 3800 64 and he is thinking about getting an 8800GT. He games mainly at 1680x1050.

At that resolution, would his CPU be bottlenecking the graphics card?
November 13, 2007 1:11:28 AM

An Athlon XP 3800 64? It's either XP... or 64... unless you mean X2. And yes, it will bottleneck the video card.
November 13, 2007 1:13:31 AM

3800+ Athlon 64.

So it would be a bottleneck?
November 13, 2007 1:16:02 AM

Yeah it's going to be a bottleneck. What video card does he use currently?
November 13, 2007 1:18:56 AM

Definitely. Guessing at what you have, your system is ready to be forsaken and a new build done.
November 13, 2007 1:31:44 AM

He is using a 7800GT currently.


MY system is OK.


(3.0GHz E6850 @ 3.6GHz, 8800GT OC'd to 675MHz) :)
November 13, 2007 1:31:51 AM

It's not that clear cut, guys. Some games are more CPU bound, some are more GPU bound, and in many games there would be little bottleneck for him.

FEAR is very GPU bound. Also, note how the 8800GT does in this review paired with numerous CPUs. Note that in UT3, Crysis, and COD4 at 16x12, the 8800GT is the bottleneck at their test settings!
http://www.firingsquad.com/hardware/$500_gaming_pc_upgrade/page5.asp

If your friend has a Socket 939 mobo, he could put an X2 4200+ dual core on it for about $75 USD. But I wouldn't worry too much about the CPU bottleneck. In some games he won't get all the 8800GT has got; in others he will. Overall I think it would be worth it for high-detail 1680x1050 gaming to have the 8800GT (or the upcoming HD3870).
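A back-of-the-envelope way to think about the CPU-bound vs. GPU-bound distinction: per frame, the game does some CPU work and some GPU work, and (ignoring pipelining) whichever takes longer gates the frame rate. A toy sketch, with all timings invented for illustration rather than measured:

```python
# Toy bottleneck model: the frame rate is limited by whichever of the
# CPU or GPU takes longer per frame. All numbers below are made up
# purely to illustrate the idea, not benchmarks.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower stage gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical single-core CPU: ~12 ms of game logic per frame.
cpu_ms = 12.0

# GPU time per frame grows with resolution (invented values).
for res, gpu_ms in [("1024x768", 6.0), ("1680x1050", 14.0), ("1920x1200", 20.0)]:
    limiter = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{res}: {fps(cpu_ms, gpu_ms):.0f} fps, {limiter}-bound")
```

At low res the CPU gates the frame rate, so a faster CPU helps; as resolution and detail climb, GPU time dominates and a faster CPU stops showing up in the numbers, which is exactly the pattern in the linked review.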
November 13, 2007 1:38:45 AM

He is mainly interested in UT3-engine games (BioShock, Gears of War, UT3) as well as Team Fortress 2.

As for MY CPU (E6850 3.0GHz @ 3.6GHz Conroe), do you think that will be OK for the next big GPU release? You know, that mythical one-teraflop Nvidia card?
November 13, 2007 1:45:38 AM

That video card will probably be bottlenecked by every processor on the planet...if the "rumors" are right.

He will notice a performance increase, but it won't be much since he will be CPU bottlenecked. The bottleneck will be smaller at higher resolutions, since those depend more on the GPU and less on the CPU, but there will still be a bottleneck. I would say do a whole motherboard/CPU/RAM upgrade, then worry about the video card.
November 13, 2007 1:54:33 AM

Nightscope, did you look at that review I linked to? Notice the X2 4000+ keeping up with the Core 2 Extreme X6800 paired with an 8800GT in all three games. The 8800GT was the bottleneck in average fps. I wouldn't doubt that the lows during actual gameplay would drop further on the slower CPU. If the single core becomes the bottleneck in actual gameplay, not 10x7 CPU scaling tests, then he could pop a $75 X2 4200+ into his rig if he has S939.
November 13, 2007 1:58:26 AM

Yeah, I think your CPU will be fine for the next release. You have to keep in mind CPU scaling at low res vs. actual gameplay, where typically your settings are tweaked for some nice IQ and the GPU is stressed. Some games, like RTSes or flight sims, can be very CPU bound. Others beat up the GPU at max playable settings.
November 13, 2007 2:05:01 AM

Here's an example. We already saw above that when details are cranked and FSAA is used, even at 16x12 the 8800GT was the UT3 bottleneck, even with an X2 4000+.

But look at Anand's CPU analysis of UT3.
http://www.anandtech.com/video/showdoc.aspx?i=3127&p=2

At 1024x768 with no FSAA, we can learn a lot about multi-core, FSB, clock speed, and cache scaling in UT3. BUT who plays at 10x7 with no FSAA on high-end cards? What does it really matter if a 70% lead in 10x7 scaling tests amounts to a 0.4 fps lead in actual gameplay? CPU/system bottlenecks (within reason) at max playable settings are overhyped, IMO. Some games, sure; most, nope.
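The arithmetic behind that point, with invented numbers in the same spirit as the scaling-test comparison (not figures from the article):

```python
# Illustration with made-up numbers: a big percentage lead in a low-res
# CPU-scaling test can shrink to a fraction of a frame at playable,
# GPU-bound settings.

# Low-res, no-FSAA scaling test: CPU-bound, so the fast CPU runs away.
fast_cpu_lowres, slow_cpu_lowres = 170.0, 100.0          # fps at 10x7
lead_pct = (fast_cpu_lowres / slow_cpu_lowres - 1) * 100
print(f"Low-res scaling lead: {lead_pct:.0f}%")          # a 70% lead

# Actual gameplay settings: GPU-bound, so both CPUs hit the same wall.
fast_cpu_game, slow_cpu_game = 42.4, 42.0                # fps at 16x12
print(f"In-game lead: {fast_cpu_game - slow_cpu_game:.1f} fps")
```

Same two CPUs, same game: the headline percentage comes entirely from a resolution nobody plays at on a high-end card.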
November 13, 2007 2:11:18 AM

With graphics at full tilt and all hardware functions enabled, even a 6600 has a hard time keeping up with that card. Does anyone remember Tom's write-up about this?
November 13, 2007 2:18:26 AM

There comes a point at which getting a better processor doesn't help. With his current Athlon 3800+ processor, he will experience a bottleneck. He would also experience a bottleneck with the X2 4200, but it would not be noticeable. It depends on the resolution too.
November 13, 2007 2:23:53 AM

I've long hoped that when retail Crysis comes out I can find the time to test my FX-55 at a few clock speeds vs. an X2 4200 at a few speeds, to see if there is any difference at all at playable settings between a single-core A64 and an X2. My goal was to do a few games, but we shall see as time allows.
November 13, 2007 2:28:18 AM

Well, my eVGA 8800GT KO cannot be overclocked to SSC speeds. It crashes.

Lame.
November 13, 2007 2:30:15 AM

Yes... even current C2D processors will bottleneck the 8800GT, again largely depending on the resolution. As long as the framerates you're getting are playable, that's all that matters. Example: an X2 3800+ @ stock is still more than fast enough to allow an 8800GTX to perform well at high resolutions in most games.

When someone says the CPU is bottlenecking the GPU, they shouldn't mean it will actually impact gameplay (though having an additional core to spare will smooth gameplay out some; other than that, no)... a slower CPU 'will' however bottleneck the GPU at the upper extreme of its fps, to where you'll only be getting maybe 110 fps instead of 140 fps with the fastest CPU... but neither difference is detectable to the human eye, and your display would limit the framerate difference anyhow if you have vsync enabled (capped at 60 fps usually).
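The vsync point above can be sketched the same way. This is a simplified model that just clamps to the refresh rate (real double-buffered vsync actually snaps to refresh/1, refresh/2, and so on, but the conclusion is the same):

```python
# Simplified vsync model: the monitor refresh caps what you actually see.
# (Real double-buffered vsync quantizes to refresh/1, refresh/2, ...,
# but clamping is enough to show the point.)

def displayed_fps(rendered_fps, refresh_hz=60.0):
    """Frame rate shown on screen with vsync on, in this cap model."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(110.0))  # slower CPU: renders 110 fps
print(displayed_fps(140.0))  # fastest CPU: renders 140 fps, same on screen
```

Both the 110 fps and 140 fps cases land on the same 60 fps at the display, so the "bottleneck" between those two CPUs never reaches the player.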
November 13, 2007 2:31:53 AM

"even current C2D processors will bottleneck the 8800GT"

Are you serious? My uber 3.6GHz Core 2 Duo is bottlenecking my GT?
November 13, 2007 2:32:06 AM

Oh, the 8800GTX in SLI did show a real-world benefit for the X6800 over the FX-62, but notice that in Carbon it needed no FSAA and motion blur disabled to be playable. Insane resolution for sure, but it just shows that the res and settings clearly change the bottleneck.
http://enthusiast.hardocp.com/article.html?art=MTI2Miw4...
November 13, 2007 2:35:55 AM

It is. Look at the resolution you're using, for example... below 1920x1200, you'll probably be seeing your CPU bottleneck your GPU... but above that resolution, your GPU will begin to bottleneck your CPU more, because that's where an 8800GT starts to taper off and you'll be getting lower and lower fps.

The point is, though, that oftentimes the CPU is the third thing to consider for gaming performance: first the GPU, then memory amount, then CPU speed, then memory speed, then everything else. It can differ from game to game as far as what takes priority, but that's a general rule of thumb for typical gameplay.
November 13, 2007 2:47:52 AM

EricVPI said:
"even current C2D processors will bottleneck the 8800GT"

Are you serious? My uber 3.6GHz Core 2 Duo is bottlenecking my GT?


Maybe some unmeasurable amount. :) But if a stock-speed X2 4000+ measures identical framerates in a game, I say no, it is not. Some games, yes; low res, yep. But most games at high res with details and eye candy cranked, nope. Take Crysis: crank it to true max details and AA/AF at 19x12, and I don't think the equivalent of a 10GHz quad core would give any noticeable or even measurable boost at all to the slideshow.

Don't get me wrong, I wouldn't go slap a second 8800GT (SLI) into an A64 3800+ rig, but I sure wouldn't hesitate to buy an 8800GT/HD3870 over an X1950 Pro/8600GTS because of that CPU either, if I gamed at 16x10 or above.
November 13, 2007 3:10:30 AM

choirbass said:
It is. Look at the resolution you're using, for example... below 1920x1200, you'll probably be seeing your CPU bottleneck your GPU... but above that resolution, your GPU will begin to bottleneck your CPU more, because that's where an 8800GT starts to taper off and you'll be getting lower and lower fps.

How can you say that?

Crysis 16x12 (just High details, not even Very High):
The X6800 is identical to an E4300 and beats an X2 4000+ by 0.3 fps
http://www.firingsquad.com/hardware/$500_gaming_pc_upgrade/page8.asp

UT3 16x12:
X2 4000+ beats the X6800 by 0.7 fps
http://www.firingsquad.com/hardware/$500_gaming_pc_upgrade/page5.asp

COD4 16x12:
The X2 4200+ beats the X6800 by 0.2 fps
http://www.firingsquad.com/hardware/$500_gaming_pc_upgrade/page5.asp


The Core 2 Extreme X6800 is way more powerful than the E4300 or X2 4000+, yet at 16x12 the 8800GT was obviously the bottleneck in all three games, as the performance was equal across the CPUs.



Quote:
The point is, though, that oftentimes the CPU is the third thing to consider for gaming performance: first the GPU, then memory amount, then CPU speed, then memory speed, then everything else. It can differ from game to game as far as what takes priority, but that's a general rule of thumb for typical gameplay.


I agree with that part, and I don't think we are all too far off in our thinking. But I sure don't agree that below 19x12 all of today's CPUs are bottlenecking the 8800GT. Sure, in older games it could scale at 19x12 depending on the CPU. But that is not a general rule of thumb by any means in modern games, and especially not one that will affect playable settings. For example, the 8800GT can't max 12x10 in Crysis.
November 13, 2007 6:51:34 AM

Yeah, for DX10 games such as Crysis and BioShock, the demand on DX10-capable GPUs is without a doubt greater, so going much above 12x10 isn't really plausible, especially on max settings, if you want your framerates to be somewhat fluid... As for the 19x12 res, that was mainly picked because it's where the GT and GTX start to deviate in framerates a lot more in DX9... before that point they're pretty much identical in performance, within a few fps... and the two GTXs are the only cards faster than the GT.

I agree with what you're saying, though... I'm just pointing out that the CPU tends to have very little impact on actual gameplay (you noted a difference of <1 fps in all 3 games; a GPU bottleneck in DX10, definitely)... unless you've got a CPU from 2001 or so, in which case your CPU is definitely a bottleneck at all resolutions, as would be your whole system most likely... but even single-core CPUs should be more than capable of handling a current game above 30 fps at a somewhat high res. Look at the XP 2500+ in an article THG did reviewing dated CPU performance in games (it was a S462 Athlon XP anyhow, if that wasn't the exact model)... even for as old as it is, it was still able to run fairly new games at decent resolutions and framerates. The CPU was definitely a bottleneck, no doubt, but the framerates were still definitely playable too... and with as fast as CPUs are now compared to then, CPU performance for gaming is even more of a non-issue.

However, with the introduction of realistic physics and AI and all that, the CPU starts to play a more significant role... so, yeah.

I guess it's mainly the difference between DX9 and DX10 more than anything... current GPUs can't handle DX10 very well at all except at somewhat lower resolutions, but they're excellent when it comes to DX9.
November 13, 2007 7:03:21 AM

choirbass said:
Yeah, for DX10 games such as Crysis and BioShock, the demand on DX10-capable GPUs is without a doubt greater, so going much above 12x10 isn't really plausible, especially on max settings, if you want your framerates to be somewhat fluid... As for the 19x12 res, that was mainly picked because it's where the GT and GTX start to deviate in framerates a lot more in DX9... before that point they're pretty much identical in performance, within a few fps... and the two GTXs are the only cards faster than the GT.

I agree with what you're saying, though... I'm just pointing out that the CPU tends to have very little impact on actual gameplay (you noted a difference of <1 fps in all 3 games; a GPU bottleneck in DX10, definitely)... unless you've got a CPU from 2001 or so, in which case your CPU is definitely a bottleneck at all resolutions, as would be your whole system most likely... but even single-core CPUs should be more than capable of handling a current game above 30 fps at a somewhat high res. Look at the XP 2500+ in an article THG did reviewing dated CPU performance in games (it was a S462 Athlon XP anyhow, if that wasn't the exact model)... even for as old as it is, it was still able to run fairly new games at decent resolutions and framerates. The CPU was definitely a bottleneck, no doubt, but the framerates were still definitely playable too... and with as fast as CPUs are now compared to then, CPU performance for gaming is even more of a non-issue.

However, with the introduction of realistic physics and AI and all that, the CPU starts to play a more significant role... so, yeah.

I guess it's mainly the difference between DX9 and DX10 more than anything... current GPUs can't handle DX10 very well at all except at somewhat lower resolutions, but they're excellent when it comes to DX9.

I think what's really taxing the CPU in those games is the physics of the game, particle effects, etc. Really, who plays at these kinds of resolutions where online gaming is concerned?