Why do higher resolutions show less variance in average game FPS between different CPUs?

elitose

Apr 20, 2014
I noticed that at 1080p there can be a fair bit of difference in the average frame rate of certain games depending on the CPU being used. However, at higher resolutions the CPU matters less and the difference in average frame rates gets much smaller, sometimes to the point of no difference at all. If at 1080p and above the GPU is reaching full 100% usage, what's going on with the CPU that makes it more influential on frame rate at lower resolutions than at higher ones? I'm just generally curious.

An example would be an i5 vs an i7: at 1080p the i7 may get 15-20 more fps in certain games, at 1440p only 5-8 more, and at 4K as little as 1-3 more. Yet at all three resolutions the GPU is reaching 100% utilization.
 
Solution
At low resolutions, especially in multiplayer games, the CPU is usually the bottleneck, at least with upper-tier cards. The CPU's per-frame work (game logic, draw calls) is roughly the same at every resolution, while the GPU's per-frame work grows with the pixel count, so a faster CPU only shows up when the GPU can keep pace. Bump up to 4K and every card made today will struggle, and even a pair of cards in SLI / CFX has trouble keeping up in many games, so the GPU sets the frame rate and the CPU difference largely disappears.
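A rough way to picture it is a "slowest stage wins" model: the frame isn't finished until both the CPU and the GPU have done their part, so whichever takes longer sets the frame time. The sketch below uses made-up per-frame times purely for illustration, not real benchmark numbers:

```python
# Hypothetical per-frame times (milliseconds). CPU time is roughly fixed per
# frame; GPU time grows with resolution because there are more pixels to shade.
CPU_FRAME_MS = {"i5": 8.0, "i7": 6.5}
GPU_FRAME_MS = {"1080p": 7.0, "1440p": 12.0, "4K": 25.0}

for res, gpu_ms in GPU_FRAME_MS.items():
    for cpu, cpu_ms in CPU_FRAME_MS.items():
        # The slower of the two stages determines the delivered frame time.
        frame_ms = max(cpu_ms, gpu_ms)
        print(f"{res:>5} + {cpu}: {1000 / frame_ms:5.1f} fps")
```

With these assumed numbers, the i7 is ~18 fps ahead at 1080p (where the CPU is the slower stage), but at 1440p and 4K both CPUs finish well before the GPU does, so the two systems land on identical frame rates even though the GPU reads 100% usage in every case.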