I noticed that at 1080p there can be a fair bit of difference in a game's average frame rate depending on the CPU being used. However, at higher resolutions the CPU matters less, and the difference in average frame rates gets much smaller, sometimes to no difference at all. If at 1080p and above the GPU is reaching full 100% usage, what's going on with the CPU that makes it more influential on frame rate at lower resolutions than at higher ones? I'm just generally curious.
An example would be an i5 vs. an i7: at 1080p the i7 may get 15-20 more fps in certain games, at 1440p only 5-8 more, and at 4K as little as 1-3 more. Yet at all three resolutions the GPU is reaching 100% utilization.
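To make the pattern I mean concrete, here's a toy sketch (Python, with completely made-up frame times, not measurements) of the simple "slowest component wins" model I have in my head: each frame takes as long as whichever of the CPU or GPU needs more time, and the GPU's cost grows with resolution while the CPU's stays roughly flat.

```python
# Toy model with made-up per-frame times; assumes frame time is simply
# limited by whichever of the CPU or GPU takes longer on that frame.
def fps(cpu_ms, gpu_ms):
    """Frames per second if one frame costs max(cpu_ms, gpu_ms)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = {"i5": 9.0, "i7": 7.5}  # hypothetical per-frame CPU work
gpu_ms = {"1080p": 8.0, "1440p": 12.0, "4K": 22.0}  # GPU cost rises with resolution

for res, g in gpu_ms.items():
    gap = fps(cpu_ms["i7"], g) - fps(cpu_ms["i5"], g)
    print(f"{res}: i5 {fps(cpu_ms['i5'], g):.0f} fps, "
          f"i7 {fps(cpu_ms['i7'], g):.0f} fps, gap {gap:.0f} fps")
```

In this crude model the CPU gap is largest at 1080p and collapses entirely once the GPU cost dominates, which is roughly the trend I'm describing, though in real games the gap at 1440p and 4K shrinks rather than hitting exactly zero, so this clearly isn't the whole story.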