Why is there a performance difference even when the CPU isn't at 100% utilization?

quitquitquit

Commendable
Jul 22, 2016
I was looking for an answer, but I couldn't find one.
Basically, there are CPU comparisons in many games.
So they test a game with, say, an i7, an i5, and an AMD chip. During the game the GPU is at 100%, but the CPUs average 40/50/55% across all cores, and yet somehow the FPS differs, something like 70/60/50 fps.
Why is that? Bad optimization?
It happens to me as well. For example, when I play Heroes of the Storm I get drops from 60 fps to 30 fps, but my GPU usage is 60% and my CPU is at 30% on all cores (2700K/GTX 580).
 
Solution
One possible explanation is that the game isn't distributing its workload evenly across all cores/threads. For instance, a game that can only utilize one thread and is severely bottlenecked by it would show up as 25% CPU utilization on an i5 and 12.5% utilization on an i7. If the i5 were newer than the i7 (better IPC from a newer architecture), framerates would be much higher on it, or vice versa, despite neither CPU being "fully utilized".

This is why AMD CPUs often perform poorly in games - 8 slow cores tend to fall behind 4 fast cores, because most of those cores sit idle.
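To make the arithmetic above concrete, here is a minimal sketch with made-up thread counts showing how one maxed-out game thread gets averaged into the "overall CPU usage" figure:

```python
# Minimal sketch (hypothetical core counts): how a single pegged game thread
# shows up in the average "CPU usage" number on different CPUs.

def average_usage(busy_threads, total_threads):
    """Average utilization across all hardware threads, in percent."""
    return 100.0 * busy_threads / total_threads

print(average_usage(1, 4))  # 4-core/4-thread i5 -> 25.0 %
print(average_usage(1, 8))  # 4-core/8-thread i7 -> 12.5 %
```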

IDProG

Distinguished
The concept is like this. In a game, the CPU is needed to produce things (pedestrians, enemy movements, and so on), and the GPU is needed to make everything, both the preset parts like the environment and the randomly produced parts like pedestrians, visible and recognizable to human eyes. The faster the CPU produces things, the faster the GPU can visualize the game, and vice versa, in the normal case.

In other cases, the GPU might overpower the CPU: while the CPU is working at full speed producing things, the GPU only needs a fraction of its power to visualize them, so the GPU runs below full power while it waits for the CPU to keep up. That's how bottlenecking happens, and that's why your GPU isn't at 100% usage. And yes, for now it's hard for a game to utilize all cores of a CPU efficiently.
Sorry for my bad grammar.
Edit: Actually, your 2700K will never bottleneck your GTX 580. You may have either set your GPU to "Adaptive" mode in the Nvidia Control Panel or had it preset that way.
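A rough sketch of the idea described above, with illustrative (not measured) frame times: when the CPU and GPU each work on a frame in parallel, the FPS is capped by whichever takes longer, and the faster part sits partly idle.

```python
# Rough sketch of the CPU-produces / GPU-renders idea (illustrative numbers):
# with the two pipelined, FPS is limited by whichever needs more time per frame.

def estimated_fps(cpu_ms, gpu_ms):
    """FPS cap when the CPU and GPU each work on a frame in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case: the GPU could finish a frame in 8 ms (125 FPS),
# but the CPU needs 16 ms, so the GPU idles half the time (~50% usage).
print(estimated_fps(cpu_ms=16.0, gpu_ms=8.0))  # -> 62.5 FPS
```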
 

quitquitquit

Commendable
Jul 22, 2016
I understand the bottleneck concept and the role of the CPU; that's exactly why the situation in this picture seems strange to me:
https://dl.dropboxusercontent.com/u/39164584/usage.png
Same game, same run, different configurations, different FPS, but... none of the CPUs and GPUs are even trying to work at 100%, or anywhere close to it. Why is that? I know it can happen when there is a frame limiter, but what about when there is nothing like that? Why don't they utilize their full potential?
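One quick way to check this on your own machine is to look at per-core usage instead of the overall average, since a headline figure like "30% CPU" can hide one core pinned near 100%. A small sketch, assuming the third-party psutil package is installed:

```python
# Per-core check (assumes "pip install psutil"): the overall average can
# hide a single maxed-out core.
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # percent per logical CPU
average = sum(per_core) / len(per_core)

print("per-core:", per_core)
print("average : %.1f%%" % average)
# If one entry sits near 100% while the average stays low, the game is
# limited by a single thread even though the CPU "isn't fully utilized".
```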
 

IDProG

Distinguished
Wow, I've never seen anything like that before! The test itself isn't even relevant, because different GPUs were used in it. The tester is not a professional tester who always keeps GPU usage at 100%. Better not to trust that test.
 