Why is bottlenecking bad?

Last response: in Graphics & Displays
October 19, 2009 12:06:59 PM

From my understanding, bottlenecking is when your CPU can only send so much information to the GPU, which can process more information than the CPU is able to send.

For a practical example, I run TF2 @ 60 FPS on all low settings. I then upgrade to all high settings, and I still get 60 FPS. (Just an example, I wish I did.)

But the fact is, I still get the 60 FPS. That's well beyond what most people can perceive, and it plays smoothly.

So my question is: is the only reason bottlenecking is bad that people want to see their frames at 120, and are unhappy if it's a constant 60?


October 19, 2009 12:20:44 PM

Bottlenecking is the effect of any kind of flow being reduced by a narrow point along its journey.

It can happen in traffic on the road, or between any two components in your PC.

Bottlenecks in hardware are mainly caused by a mismatch of hardware, or the inclusion of a slower technology.

e.g.

A single-core CPU with a fast multi-core GPU and slow system RAM: the CPU could not provide data fast enough for the GPU to be fully utilised, which might leave the GPU running at only 50% capacity.

or

A fast quad-core CPU with slow RAM and a slow hard drive: the CPU might be capable of processing more data than it has access to from the RAM or the HDD.
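Both examples boil down to one rule: a frame has to pass through every stage (CPU, RAM, GPU), so the whole pipeline runs at the speed of its slowest stage. A minimal Python sketch of that idea, using made-up stage rates rather than real benchmarks:

```python
# Sketch of the "narrowest point" idea: overall throughput is capped by
# the slowest stage. Stage rates are hypothetical frames/sec, not benchmarks.

def pipeline_fps(stage_rates):
    """Frames/sec the whole pipeline can sustain (slowest stage wins)."""
    return min(stage_rates.values())

balanced = {"cpu": 120, "ram": 110, "gpu": 125}
mismatched = {"cpu": 60, "ram": 55, "gpu": 250}   # fast GPU, slow CPU/RAM

print(pipeline_fps(balanced))     # 110
print(pipeline_fps(mismatched))   # 55 -> the fast GPU idles most of the time

gpu_utilisation = pipeline_fps(mismatched) / mismatched["gpu"]
print(round(gpu_utilisation * 100))  # 22 -> GPU running well under capacity
```

Swap in different numbers and the answer is always the minimum: upgrading any stage other than the slowest one changes nothing.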

October 19, 2009 1:33:48 PM

Shadow, in the example you gave, bottlenecking may not be a big deal at all. However, in many cases we are talking about much lower FPS than that.

Also keep in mind that bottlenecking is not something that ruins performance, but something that wastes money. It is important because it will make two GPUs at vastly different price points perform the same. Thus if you are bottlenecked, there is no point in spending $400 on a GPU when $100 will give you the same results.

Thinking about a bottleneck is more important from a value point of view than a performance point of view. Obviously if you can get your CPU to push 60 FPS at high settings you are golden, though it is not a hard cutoff: there is a large transition region where the scaling of your system gradually gets robbed.
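The value argument can be put into a one-line model: delivered FPS is the minimum of what the CPU and GPU can each sustain, so once the CPU is the limit, a pricier GPU buys nothing. A rough sketch with hypothetical prices and frame rates:

```python
# Hypothetical numbers: a CPU that can prepare 60 frames/sec, and two
# GPUs that could render 90 and 240 frames/sec on their own.

def delivered_fps(cpu_fps, gpu_fps):
    """The slower of the two components determines what you actually see."""
    return min(cpu_fps, gpu_fps)

cpu_limit = 60
for price, gpu_fps in [(100, 90), (400, 240)]:
    print(f"${price} GPU -> {delivered_fps(cpu_limit, gpu_fps)} fps")
    # both lines print 60 fps: the $300 difference is wasted here
```

In the transition region the two limits are close and the minimum is a rough approximation, but as a rule of thumb it shows why the bottleneck question is really a value question.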

If you need more information on system bottlenecks, let me know. They are fun stuff... though I could (and have) go on for days...
October 19, 2009 3:08:48 PM

shadow187 said:
From my understanding, bottlenecking is when your CPU can only send so much information to the GPU, which can process more information than the CPU is able to send.

For a practical example, I run TF2 @ 60 FPS on all low settings. I then upgrade to all high settings, and I still get 60 FPS. (Just an example, I wish I did.)

But the fact is, I still get the 60 FPS. That's well beyond what most people can perceive, and it plays smoothly.

So my question is: is the only reason bottlenecking is bad that people want to see their frames at 120, and are unhappy if it's a constant 60?



Your basic understanding is right, but as daedalus685 is saying, the technicalities of it are really quite complicated and involved. In your example, for instance, still seeing 60 FPS after upgrading could be as simple as V-sync being enabled, which, if you don't know, locks your frame rate to the refresh rate of your monitor, usually 60 Hz, so 60 FPS.
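The V-sync cap can be sketched in a couple of lines. This is a simplification (real double-buffered V-sync can also force below-refresh rates down to divisors like 30 FPS), and the 60 Hz refresh rate is just the common case assumed here:

```python
# Simplified model of V-sync: frames are only shown on the monitor's
# refresh tick, so anything rendered above the refresh rate is never seen.
# (Ignores the double-buffering effect that can snap sub-60 rates to 30.)

REFRESH_HZ = 60

def displayed_fps(rendered_fps, vsync=True):
    """FPS you actually see, given what the GPU could render."""
    return min(rendered_fps, REFRESH_HZ) if vsync else rendered_fps

print(displayed_fps(143))               # 60  (V-sync caps it)
print(displayed_fps(143, vsync=False))  # 143 (uncapped, but may tear)
print(displayed_fps(45))                # 45  (below the cap, no help)
```

This is why a flat 60 FPS before and after a settings change does not, by itself, tell you where the bottleneck is.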

dougie_boy's example of a single-core CPU teamed with a high-end GPU is a good example of when an actual bottleneck could happen. Most of the time, what people are actually talking about when they say "bottleneck" is a system restriction, which is basically the CPU restricting the capabilities of the GPU, as you describe in your assessment.

Mactronix

October 19, 2009 4:29:05 PM

Well, for instance (and this is reality), I'm getting a new computer that has an Athlon 64 X2 7550 (2.6 GHz, I think). I'm also getting an EXTREMELY good deal on a Radeon 4890 2GB ($80). I know there is a bottleneck between those two, but I'll be able to max out L4D @ 1280x1024 (16xAA/AF) and still get playable frames, right?
October 19, 2009 4:48:37 PM

Yes, but at that resolution the CPU will be a large bottleneck. Though that doesn't matter if the 4890 is that low in cost. Just know that you don't need nearly that much graphics power if the deal you are getting were to fall through.
October 19, 2009 6:17:12 PM

Your PC is full of bottlenecks; you can't avoid it. Your hard drive bottlenecks your RAM, your RAM bottlenecks your L3 cache (if you have it), your L3 bottlenecks your L2 cache, your L2 bottlenecks your L1 cache, etc., etc. And that is just data access.

Your goal is to minimize bottlenecks for the best performance. Sometimes doing this costs MEGA BUCKS, but you can minimize them on a more realistic level. Games tend to favor faster CPUs, not so much multi-core CPUs, so a fast dual core may suit you best. However, other apps like video encoding can take more advantage of quad-core CPUs, in which case that may be a better choice. You have to get what best fits your needs. I think an Athlon 64 X2 7550 (BTW, it's 2.5 GHz) will do fine with a 4890. If it were paired with a single-core Pentium 4, your CPU wouldn't be able to process information fast enough to feed the video card and you would see an increased bottleneck. You just have to have that balance.
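The cache-hierarchy point can be sketched numerically: average access time is a weighted sum over how often each level is actually reached, so one slow level (RAM here) drags the whole chain down. The hit rates and latencies below are ballpark illustrative figures, not measurements from any real CPU:

```python
# Sketch of "every level bottlenecks the next": average access time
# depends on how often a request falls through to the slower level.
# Hit rates and nanosecond latencies are illustrative, not measured.

def avg_access_ns(levels):
    """levels: list of (hit_rate, latency_ns), fastest level first."""
    expected, p_reach = 0.0, 1.0
    for hit_rate, latency in levels:
        expected += p_reach * hit_rate * latency  # served at this level
        p_reach *= (1 - hit_rate)                 # misses fall through
    return expected

hierarchy = [(0.90, 1), (0.95, 5), (0.99, 15), (1.00, 60)]  # L1..RAM
print(round(avg_access_ns(hierarchy), 2))  # 1.45
```

Even though RAM is 60x slower than L1 in this toy model, high hit rates keep the average close to L1 speed, which is exactly the "balance" being described: the fast levels exist to hide the slow ones.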

BTW, is V-sync on, as suggested above? Sounds suspiciously like it. Usually when V-sync is off, FPS bounces around all over depending on the scene.
October 19, 2009 6:23:29 PM

jay2tall said:
BTW, is V-sync on, as suggested above? Sounds suspiciously like it. Usually when V-sync is off, FPS bounces around all over depending on the scene.


The OP's example was figurative; V-sync has nothing to do with it at all.
October 19, 2009 7:55:41 PM

daedalus is right.

Also, daed, I will be getting playable frames, and that's all that matters, especially as I turn up the eye candy.
Not to mention I might get a better 24" monitor later down the road, especially when I get a job.
October 19, 2009 8:09:21 PM

I like jobs. haha... and 24" monitors.
October 19, 2009 8:18:03 PM

jay2tall said:
I like jobs. haha... and 24" monitors.


I must say there are days I like my 24" more than my job.. lol.
October 19, 2009 9:56:23 PM

Shadow, TF2 isn't much to worry about, but try a game like GTA IV, Crysis, WIC, etc.

We have something coming up on Tom's soon that fits into this topic, but you may want to check this performance comparison, as it relates well to you: http://www.tomshardware.com/reviews/amd-cpu-overclock,2...

In it we paired 4 CPUs (including yours) with 2 graphics cards. It will give you a chance to see how the X2 7750 Kuma was holding back performance quite a bit compared to Phenom II processors. In some cases, upgrading from an HD 4870 to the dual-GPU HD 4870 X2 provided zero increase in performance because of the CPU. Now, UT3 was not an issue; like TF2 in your example, all CPUs were playable. But look over WIC and Crysis, where the CPU held performance down to around 30 average FPS. Doubling graphics power with the 4870 X2 then brought zero increase in performance, actually a slight decrease, likely from the additional driver overhead of dual GPUs.
