It seems my FX-55 is bottlenecking my HD3850

January 16, 2008 5:09:24 PM

Does that seem right to you guys? 'Cause I got a stock 3DMark06 score of 5400, then I OC'd my FX-55 by only 200 MHz and got over 800 more points. What do you think is wrong?


January 16, 2008 6:16:59 PM

An FX-55 will bottleneck 3DMark06, but it won't bottleneck gaming much as long as you're running games at high resolutions.
January 16, 2008 6:31:28 PM

Look at it this way:

At the resolution 3DMark06 runs at (1280x1024 by default), your CPU will not be able to keep up with the number of frames the GPU is cranking out.

Ex.

@1280x1024: GPU is putting out 100 FPS - CPU only allows 70 FPS - that's a 30% loss.

@1920x1200: GPU puts out 60 FPS - the CPU allows 55 FPS - only a 9% loss.

So at a higher res, your CPU will be less of a bottleneck. At least, this is the way I understand it... Hopefully I didn't confuse you. :p 

Anyways, a faster CPU or an overclock should always net you a few more frames from your GPU.
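
To make that concrete, here's a minimal sketch (Python, using the hypothetical numbers from the example above rather than measurements) of the "slowest component wins" idea:

```python
# Rough model of the reasoning above: the frame rate you actually see is
# capped by whichever of the CPU or GPU is slower. Numbers are the
# hypothetical ones from the example, not benchmarks.

def effective_fps(gpu_fps: float, cpu_fps: float) -> float:
    """The slower component limits what you actually get on screen."""
    return min(gpu_fps, cpu_fps)

def cpu_loss_percent(gpu_fps: float, cpu_fps: float) -> float:
    """How much of the GPU's potential output the CPU costs you."""
    return 100.0 * (gpu_fps - effective_fps(gpu_fps, cpu_fps)) / gpu_fps

print(cpu_loss_percent(100, 70))  # 1280x1024: 30.0 -> a 30% loss
print(cpu_loss_percent(60, 55))   # 1920x1200: ~8.3 -> roughly a 9% loss
```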
January 16, 2008 9:37:20 PM

Oh, thanks strangestranger for making me feel welcome with my first post. Anyway, to those of you who are being helpful, thank you for answering. So a score of ~6200 3DMark06 points with a 3850 sounds right to you guys? It just seems like in all the reviews I've seen it has scored 8,000 to 10,000 depending on the setup. So I'm guessing the Core 2s are rocking my FX-55 as badly as it looks? :ange:  :p  haha thanks!
January 16, 2008 11:40:23 PM

Wow, OK, well, thanks for your input. I'm guessing I need a larger heatsink to OC my FX-55 to the max for the time being, because a college student has trouble finding the money to get the HD3850, let alone a whole new build. Thanks to all who helped.
January 17, 2008 3:11:50 AM

gegt_canadian, I had an FX-55 that was bottlenecking my 2900 XT severely. I had it OC'ed all the way to 3.2 GHz and it still did not help much. Do yourself a big favor and upgrade to dual or quad core. When I changed, my average frames in CSS went up by 70. I play all games at 1600x1200.

BTW, welcome to the forums, and apologies for some rude people.
January 17, 2008 4:36:39 AM

Yeah, I've been looking into it a little bit, but an upgrade for me won't happen for a while due to having pretty much no cash :p ... but thanks for your input. What processor did you upgrade to from the FX-55, dirty_harry2?
January 17, 2008 5:02:46 AM

Do not judge your system by 3DMark06 and other benchmarks. They only show you how fast your system runs that benchmark; they don't accurately show how your system will handle specific games. Dual core would help; quad core would be pushing it, since most games and applications still don't fully use quad-core chips.

A dual core would be a good improvement for your system, so I think you're good to go with just a new CPU and motherboard to go with the rest of your current parts.

January 17, 2008 5:45:04 AM

Listen to chuck on this one. You're talking about a 1,200-point hit in 3DMark06 just because you have a single-core proc (which is bullcrap). In the real world of FPS in games at a good resolution, IMO, that 3850 is a good match for your FX-55.
January 17, 2008 9:07:19 AM

I've felt that there should be a reliable listing of the CPUs that bottleneck specific cards at low resolution. I've been gaming with an Athlon X2 4600+ and a 7600GS at 1024x768 or 1280x1024, so no bottleneck, but I'm upgrading to a 3870 once income tax refunds arrive.

Because of the low res bottlenecking issues, I've also decided that I can justify a nice Viewsonic 20" LCD to replace the 17" Viewsonic CRT. That way, the CPU should be less of an issue. So, let's lobby for a Tom's Hardware CPU/GPU Bottleneck interactive chart!
January 17, 2008 1:06:59 PM

I upgraded to an AMD 6400+, gegt_canadian; a hell of a bang for the buck.
January 17, 2008 2:29:32 PM

Yeah, just search for a 939 dual core and then you'll be fine!

January 17, 2008 7:17:38 PM

Once again: the FX-55 will be bottlenecked in 3DMark.

In an actual game, at 1600x1200 or above, it's not going to matter much - if at all. The graphics card becomes the bottleneck.
January 17, 2008 9:01:20 PM

bottleneck. :x I'm just kidding, forgive me! :D 
January 17, 2008 9:27:34 PM

Well, in a multi-threaded game you are gonna lose a lot of performance versus a dual-core CPU. I found my FPS more than doubled in some games (yes, games, not benchies) when I got my E6600 over a 3700.
January 18, 2008 5:48:04 AM

There are only a handful of games that can multi-thread: Oblivion, Supreme Commander... maybe cutting-edge titles like Crysis. And even then, the performance increase is usually in the 10% area (Supreme Commander being the exception to the rule).

If you saw huge gains, chances are it wasn't from multithreading.
January 18, 2008 1:21:27 PM

I hope my P4 531 3.0 GHz can cope with the Radeon HD3870 512MB... hehehe. I just want to slowly migrate and later on follow you guys. My trouble is that my XPC SB81P supports single cores only... crazy.
January 19, 2008 12:41:10 PM

Quote:
In an actual game, at 1600x1200 or above, it's not going to matter much - if at all. The graphics card becomes the bottleneck.


This is wrong! I play at 1600x1200 on all my games. Results speak for themselves. I had an FX-55 running with my 2900 XT, and when I upgraded to my 6400+, I went up in frames in all my games...

Now I know what you are thinking: faster processor, more frames, right? Well, I had my FX-55 OC'ed to 3.1 GHz. Also, my 7950 GT was getting more FPS than my 2900 XT. In CSS I went up by 70 fps, BF2142 30 fps, Crysis 10 fps, COD4 20 fps, and DOW I don't know because it is capped at 60 fps in Fraps. However, I never dropped below 40 like I did on the old setup.
January 19, 2008 2:47:33 PM

What mobo and/or chipset are you running? nForce4 will hold you back a little... so I have been told.

I have no evidence for that, though.
January 19, 2008 9:01:44 PM

strangestranger

I did not mean to sound so forward in my correction of cleeve. I am going entirely on personal experience. I feel that the FX-55 was a bottleneck, though. Please prove me wrong, however; I need to learn more stuff. :D 
January 19, 2008 11:40:42 PM

If you're on 939, the Opteron 175 was like $100 online this week... and it will do 3-3.2 GHz on air at stock volts. Some run them up to 3.4 GHz with a voltage boost and RAM settings.
January 20, 2008 5:37:39 AM

cleeve said:
There are only a handful of games that can multi thread. Oblivion, Supreme commander... maybe cutting edge titles like crysis. And then, the performance increase is usually around the 10% area (supreme commander being the exception to the rule)

If you saw huge gains, chances are it wasn't from multithreading.

Yeah, it wasn't because of multi-threading; BF2142 more than doubled in FPS and that is single-threaded. I think my mobo had some serious issues; it did die a few months later (I think it died, something died anyway).
January 21, 2008 4:19:42 PM

Wow. ro3dog, is the 2900 Pro supposed to be the same as an HD3850? Yes, I believe it's an nForce4 SLI 32x Asus board.
February 4, 2008 2:27:34 PM

Dirty_Harry2 said:
Please however prove me wrong, I need to learn more stuff. :D 


Well, it's hard to 'prove' you wrong when you haven't really provided evidence to support your own claims.

You said "when I upgraded to my 6400+, I went up in frames in all my games...".

Well, how many frames per second did you gain, and in which games?
February 4, 2008 3:08:58 PM

Quote:
wait, you overclocked your cpu and got more points in a benchmark, no ****.


Damn. You beat me to it.
February 4, 2008 4:03:52 PM

Canadian, do some looking online and see what your old stuff will sell for used. Upgrading might not cost as much as you think. I remember I had a POS Dell 4600 that I was parting out, and some dude paid $90 for the motherboard. I took that money and bought a vastly superior board at no additional cost to begin my next build. 939 technology is quickly becoming harder to find, so you might be able to find someone who really wants an FX-55 and board and is willing to pay for it. A very good AM2 board and dual-core CPU can be had for very little cash, less than $200 easy. Also, that DDR RAM will probably sell for more than it would cost to upgrade to 2 gigs of DDR2. You might be able to upgrade at no/minimal cost.
February 4, 2008 5:03:47 PM

I upgraded my 939 board from an AMD 4000+ to an AMD Opteron 185 dual core (same as an FX-60) last month. I use an 8800GTS 320 card.
My 3DMark06 score is 9140 with a slight CPU overclock.
I got the CPU used for $150.00 and gameplay is great, even the Crysis demo.
I know the Intel chips are great bargains right now and are silly fast, but I really don't want to deal with losing all my info and reinstalling Windows if I install a new motherboard and Intel CPU.
I added a Zalman cooler.
If you are going to upgrade your 939 AMD CPU, I recommend a dual-core Opteron; even if you're not into overclocking, the Opteron 185 is the fastest 939 CPU made. It will still overclock easily to 3 GHz or more, and it will drop right into your motherboard. I didn't even have to upgrade my BIOS.
February 4, 2008 9:47:14 PM

Well, in CSS my average frames were 91 (which was 1 less than my 7950 GT), and they went up to 162. Crysis was like 10 fps, and I went up to 20. Dawn of War, well, I can't tell. BF2142 went up from 40 to 70 fps. COD4 went up 20 fps. I dunno.
February 5, 2008 1:33:50 PM

What resolution do you play at?

At higher resolutions, those framerate increases don't jibe with reality. If you got those increases at high res, something else changed, or something was wrong with your old system in the first place.
February 5, 2008 3:55:38 PM

I play at 1600x1200...maybe something was wrong with my old MOBO?
February 5, 2008 4:04:46 PM

I'd say so. At that res you should have seen a small difference with a processor swap, certainly not the huge increases you're reporting.
February 5, 2008 5:34:34 PM

Quote:
Cleeve, I don't know what you're smoking, but it's good stuff. A dual-core processor will make a HUGE difference in multithreaded games over a single core. Oh yeah, and there's a lot more multithreaded games besides Supreme Commander, Crysis, and Oblivion. lol
It's not so much the addition of cores that makes the difference, it's the change of architecture. Going from 1 or 1.5 instructions per cycle to 2.5-3 will make a huge difference.

If you took a P4 and added another core (PD), the increase still wouldn't be outrageous, even in multi-threaded apps.
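
To put rough numbers on that, here's a back-of-the-envelope sketch (Python; the IPC and clock figures are ballpark assumptions taken from the post, not measured values) treating per-core throughput as roughly IPC times clock:

```python
# Rough per-core throughput model: performance scales with instructions per
# cycle (IPC) times clock speed, so an architecture jump can dwarf the gain
# from simply adding a second core of the old design.

def per_core_throughput(ipc: float, ghz: float) -> float:
    return ipc * ghz  # billions of instructions per second, very roughly

old_core = per_core_throughput(1.25, 2.6)  # older architecture, ~1-1.5 IPC (assumed)
new_core = per_core_throughput(2.75, 2.4)  # newer architecture, ~2.5-3 IPC (assumed)

print(old_core, new_core, new_core / old_core)  # newer core is roughly 2x faster per core
```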
February 5, 2008 5:49:54 PM

Quote:
Cleeve, I don't know what you're smoking, but it's good stuff.


Your stuff is better.

The only game I've seen dual core make a HUGE difference in is Supreme Commander. Other than that, most 'dual core' optimized games will get less than a 10% boost in performance.

Not to mention, Counter-Strike isn't dual core optimized. I don't believe Dawn of War is either, and those are the games the OP listed that showed the hugest gains...

So I'll give you some of what I'm smoking if you give me some of yours. :) 
February 5, 2008 6:01:20 PM

Oblivion is a 'dual-core' optimized game. Here are some single vs dual core tests:

http://www.anandtech.com/printarticle.aspx?i=2747


Or if you want the raw numbers:

Dual Core: 39.4 FPS
Athlon64 X2 4200+ (2.2 Ghz, 512k cache x 2)

Single Core: 36.2 FPS
Athlon64 3500+ (2.2 Ghz, 512k cache)

About 9% difference... and that's at 1280x1024! A helluva lot lower res than 1600x1200, where the video card will be much more of a bottleneck than the CPU. Sure as hell not the difference the OP saw, in his games that AREN'T dual core optimized...
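
For what it's worth, here's the quick arithmetic behind that figure (just a Python check of the numbers quoted above, nothing more):

```python
# AnandTech's Oblivion numbers quoted above: dual-core X2 4200+ vs
# single-core 3500+, both at 2.2 GHz.
dual_fps, single_fps = 39.4, 36.2
print(100 * (dual_fps - single_fps) / single_fps)  # ~8.8%, i.e. about 9%
```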
February 5, 2008 7:10:12 PM

The FX-55 is a single-core processor. It will bottleneck in recently released games.
February 5, 2008 7:13:15 PM

cleeve said:
Oblivion is a 'dual-core' optimized game. Here are some single vs dual core tests:

http://www.anandtech.com/printarticle.aspx?i=2747


Or if you want the raw numbers:

Dual Core: 39.4 FPS
Athlon64 X2 4200+ (2.2 Ghz, 512k cache x 2)

Single Core: 36.2 FPS
Athlon64 3500+ (2.2 Ghz, 512k cache)

About 9% difference... sure as hell not the difference the OP saw, in his games that AREN'T dual core optimized...


Oblivion was in the infancy of dual-core optimized games. On paper it's dual-core optimized, but in real gaming performance there is no difference between single and dual.
February 5, 2008 7:16:40 PM

cleeve said:
Your stuff is better.

The only game I've seen dual core make a HUGE difference in is Supreme Commander. Other than that, most 'dual core' optimized games will get less than a 10% boost in performance.

Not to mention, Counter-Strike isn't dual core optimized. I don't believe Dawn of War is either, and those are the games the OP listed that showed the hugest gains...

So I'll give you some of what I'm smoking if you give me some of yours. :) 



Not true at all. In recent games it makes as much as a 50% difference, more or less. Of course, that would depend on the game and resolution.

http://www.gamespot.com/features/6177688/p-7.html
Take BioShock, for instance. More than a 50% difference.

http://www.gamespot.com/features/6183967/p-5.html
Call of Duty 4... nearly a 100% difference at high detail settings.

http://www.gamespot.com/features/6182806/p-6.html
Crysis is more GPU limited, but it shows over a 30% increase even at high resolutions.
February 5, 2008 7:19:43 PM

marvelous211 said:
Oblivion was in infancy of dual core optimized games. On paper it's dual core but in real gaming performance there is no difference between single or dual.


Yes, but the OP listed CSS and Dawn of war - not dual core optimized at all, to my knowledge.

In any case, other than Supreme Commander, I've yet to see a dual-core optimized game that's more bottlenecked by the CPU than the GPU. But I'll look around... maybe Crysis will surprise me.


February 5, 2008 7:21:15 PM

reread my above post. Edited for you to compare.
February 5, 2008 7:41:21 PM

I'll admit I'm pleasantly surprised at the amount of increase dual core shows in these new titles, I didn't think it'd be that dramatic.

However, at a decent resolution like 1600x1200 - the res the OP was talking about - I'm still going to maintain that the bottleneck shifts more to the graphics card, not the number of cores.

Crysis:
single core 2.4 GHz: 18 fps
dual core 2.6 GHz: 25 fps

This isn't much of a difference, especially when you take into account the .2 GHz faster the dual core is running.

Bioshock:
single core 2.4 GHz: 44 fps
dual core 2.6 GHz: 61 fps

Bioshock shows a surprising leap, I'll admit, even taking into account the .2 GHz speed deficit.

But check the last page, the OP is reporting a 72 fps increase in CSS (I don't believe that's dual-core optimized), a 30 fps increase in Dawn of War (also not dual-core optimized).

That's the result of single-core bottlenecking at 1600x1200. Something else is going on.
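
For anyone who wants to factor out that 0.2 GHz gap, here's a rough sketch (Python; it assumes FPS scales linearly with clock speed, which is only an approximation) applied to the GameSpot numbers quoted above:

```python
# Scale the dual-core result down to the single core's clock and see how much
# of the gain is left. Linear clock scaling is an assumption, not a given.

def clock_adjusted_gain(single_fps, single_ghz, dual_fps, dual_ghz):
    dual_at_same_clock = dual_fps * (single_ghz / dual_ghz)
    return 100.0 * (dual_at_same_clock - single_fps) / single_fps

print(clock_adjusted_gain(18, 2.4, 25, 2.6))  # Crysis:   ~28% gain left after adjustment
print(clock_adjusted_gain(44, 2.4, 61, 2.6))  # Bioshock: ~28% gain left after adjustment
```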

February 5, 2008 7:44:49 PM

Quote:
Crysis:
single core 2.4 GHz: 18 fps
dual core 2.6 GHz: 25 fps

This isn't much of a difference, especially when you take into account the .2 GHz faster the dual core is running.
I have to side with cleeve here. That FPS jump could probably have been achieved with a 200 MHz OC on the single core, or at least it would have closed the gap between the two.

Quote:
But check the last page, the OP is reporting a 72 fps increase in CSS (I don't believe that's dual-core optimized), a 30 fps increase in Dawn of War (also not dual-core optimized).

That's the result of single-core bottlenecking at 1600x1200. Something else is going on.
Games on the Source engine are going to benefit more than most other games from a faster CPU. That being said, I managed to jump from 38 FPS at 1280x1024 (max settings) to 120 FPS (max settings) simply by changing from a PD 8xx @ 2.66 GHz to an Athlon X2 5000+ @ 2.66 GHz. There may be some merit there, but again... at that resolution, a leap of that magnitude is questionable.
February 5, 2008 7:49:51 PM

rgeist554 said:
Quote:
Crysis:
single core 2.4 GHz: 18 fps
dual core 2.6 GHz: 25 fps

This isn't much of a difference, especially when you take into account the .2 GHz faster the dual core is running.
I have to side with cleeve here. That FPS jump could probably have been achieved with a 200 MHz OC on the single core, or at least it would have closed the gap between the two.


I call BS, since Crysis is more GPU limited than anything else. Even if you overclocked it 200 MHz it would show a 1 fps increase.

http://www.gamespot.com/features/6182806/p-6.html
Tested with a GTX @ 1600x1200, high settings.

Athlon 64 FX-60 @ 2.0 GHz: 24 fps
Athlon 64 4000+ @ 2.4 GHz: 18 fps

The single core is clocked higher while the FX-60, which is dual core, was underclocked. That is a 33% difference with the dual core clocked 400 MHz lower, even in a GPU-limited situation.
February 5, 2008 7:55:45 PM

cleeve said:
I'll admit I'm pleasantly surprised at the amount of increase dual core shows in these new titles, I didn't think it'd be that dramatic.

However, at a decent resolution like 1600x1200 - the res the OP was talking about - I'm still going to maintain that the bottleneck shifts more to the graphics card, not the number of cores.

Crysis:
single core 2.4 GHz: 18 fps
dual core 2.6 GHz: 25 fps

This isn't much of a difference, especially when you take into account the .2 GHz faster the dual core is running.

Bioshock:
single core 2.4 GHz: 44 fps
dual core 2.6 GHz: 61 fps

Bioshock shows a surprising leap, I'll admit, even taking into account the .2 GHz speed deficit.

But check the last page, the OP is reporting a 72 fps increase in CSS (I don't believe that's dual-core optimized), a 30 fps increase in Dawn of War (also not dual-core optimized).

That's the result of single-core bottlenecking at 1600x1200. Something else is going on.


Ha... look again. There is a benchmark of the FX-60 underclocked @ 2.0 GHz.

What you are trying to explain to me is about GPU-limited situations, but it still makes a difference now that games are optimized and geared towards dual cores. Today you need more oomph than a single core.
February 5, 2008 8:07:50 PM

Yep, pretty much the same results, demonstrating that a .2 or .4 GHz clock speed difference is negligible.

As i said before though, I'll admit I'm surprised at the amount of increase dual core shows in these new titles. I hadn't seen a new comparison and had based my opinion on old info.

So thanks to Marvelous for providing those benches and setting me straight.

On a side note, it looks like Dawn of War *is* dual core optimized. I'm still not convinced it accounted for a 30fps difference at 1600x1200 though.

However, this still doesn't account for a 72-fps difference in a single-core app like CSS, especially when the fellow had his single-core FX-55 CPU overclocked to 3.1 GHz. I still think something funny was going on with that fellow's machine.
February 5, 2008 8:12:49 PM

Quote:
Ok you guys are getting off the question. For you slow people, the question was: "Is my single core processor bottlenecking my HD3850"? The answer is clearly YES.


We have a few people here who think there isn't that much of a difference between single core and dual core. Like you said, though, it does make a huge difference, especially in games that aren't GPU limited, where the CPU is running out of steam feeding the GPU.
February 5, 2008 8:14:40 PM

cleeve said:
Yep, pretty much the same results, demonstrating that a .2 or .4 GHz clock speed difference is negligible.

As i said before though, I'll admit I'm surprised at the amount of increase dual core shows in these new titles. I hadn't seen a new comparison and had based my opinion on old info.

So thanks to Marvelous for providing those benches and setting me straight.

On a side note, it looks like Dawn of War *is* dual core optimized. I'm still not convinced it accounted for a 30fps difference at 1600x1200 though.

However, this still doesn't account for a 72-fps difference in a single-core app like CSS, especially when the fellow had his single-core FX-55 CPU overclocked to 3.1 GHz. I still think something funny was going on with that fellow's machine.


Yeah, it just depends on the game. Just two years ago we barely had any dual-core optimized games, but ever since the Xbox 360 and PS3 it has started to kick up a notch. As long as it's not a GPU-limited situation, it does make a difference.
February 5, 2008 8:22:19 PM

Just some food for thought... :p 

P4 @ 2.8Ghz vs. PD @ 2.8Ghz:
http://www23.tomshardware.com/cpu_2007.html?modelx=33&m...

There will almost always be a performance increase between single and dual-cores, but in some cases (as shown above) this difference is minimal.

Then there are cases where dual-core optimization is key, such as:

E4300 @ 1.8Ghz vs. P4 @ 2.8Ghz:

http://www23.tomshardware.com/cpu_2007.html?modelx=33&m...

There is almost a 400% increase in FPS on the dual-core over the single-core chip. (This is a kind of extreme example though)
February 5, 2008 9:08:46 PM

rgeist554 said:
Just some food for thought... :p 

P4 @ 2.8Ghz vs. PD @ 2.8Ghz:
http://www23.tomshardware.com/cpu_2007.html?modelx=33&m...

There will almost always be a performance increase between single and dual-cores, but in some cases (as shown above) this difference is minimal.

Then there are cases where dual-core optimization is key, such as:

E4300 @ 1.8Ghz vs. P4 @ 2.8Ghz:

http://www23.tomshardware.com/cpu_2007.html?modelx=33&m...

There is almost a 400% increase in FPS on the dual-core over the single-core chip. (This is a kind of extreme example though)



You put up some benchmarks of a 4-year-old game (not dual core optimized) trying to differentiate between dual and single core? :non: 

Dual core doesn't always make a difference. The code has to be optimized to take advantage of dual-core or even quad-core processors.