More resolution = less CPU usage?

Yes and no.

Higher resolution means more work for the GPU, so the framerate in a game will typically drop. That means the CPU has fewer frames to work on, reducing its workload. But it's still doing exactly the same amount of work per frame, so if you're CPU-bottlenecked at either resolution, CPU usage will not change.
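
To make that concrete, here's a toy calculation (not a benchmark; the per-frame CPU cost and per-megapixel GPU cost below are made-up numbers) showing how the slower of the two sets the framerate while the CPU's per-frame work stays constant:

```python
# Toy model: whichever of CPU or GPU takes longer per frame sets the framerate.
# The numbers below are invented for illustration, not measured.
CPU_MS_PER_FRAME = 8.0        # game logic, physics, draw submission (resolution-independent)
GPU_MS_PER_MEGAPIXEL = 6.0    # pretend shading cost per million pixels

def frame_time_ms(width, height):
    gpu_ms = (width * height / 1_000_000) * GPU_MS_PER_MEGAPIXEL
    return max(CPU_MS_PER_FRAME, gpu_ms), gpu_ms

for w, h in [(1366, 768), (1920, 1080), (2560, 1440)]:
    total, gpu = frame_time_ms(w, h)
    limiter = "GPU" if gpu > CPU_MS_PER_FRAME else "CPU"
    print(f"{w}x{h}: ~{1000 / total:.0f} fps, limited by the {limiter}")
```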
 

Thanthos



So I guess I should upgrade both.
 


Sakkura pretty much got the gist of it from a high level perspective.

Video games use multiple procedures which are tightly synchronized to keep one part from getting ahead of the others.

Unless a game is purely software driven, almost all graphical tasks are delegated to the GPU(s). Key to your question is the raster process. Rasterization is the process of converting scene geometry, shadows, lighting, overlays, etc... to a bitmap that is then written into a frame buffer for eventual transmission to the display.

The entire render pipeline is far too complex to regurgitate here, so I'll focus on the pixel shader only. Pixel shaders are small programs that are run on the GPU at least once per pixel (multiple passes may sometimes be necessary) in order to create the final bitmap. The number of pixels is a function of the resolution.

1280x720 = 921,600 pixels

1920x1080 = 2,073,600 pixels

2560x1600 = 4,096,000 pixels

and so on.
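
A quick way to check those pixel counts, and to see how much more pixel-shader work each step up implies (assuming, as above, that the work scales roughly with pixel count):

```python
# Pixel counts for the resolutions above, plus the workload relative to 720p,
# assuming pixel-shader work scales roughly with pixel count.
base = 1280 * 720
for w, h in [(1280, 720), (1920, 1080), (2560, 1600)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 720p)")
```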

Older graphics cards had pixel shaders as a distinct hardware feature, and their programmability was limited. Modern graphics cards use unified shaders that are highly programmable; the same hardware handles all shader types.

Cranking up the resolution increases the minimum number of pixel-shader invocations that need to be run, which increases the total amount of time the GPU must spend on each frame before that frame is ready. Since the core components of a game engine are tightly synchronized, the CPU -- which is processing I/O, physics, and game logic -- cannot be permitted to get too far ahead of the GPU. Often, the GPU is rendering a scene that is 1-2 frames behind the one the CPU is working on, and the GPU is displaying a bitmap that is 1-2 frames behind the one being rendered.
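
As a rough sketch of that "can't get too far ahead" constraint (real engines enforce it with fences/semaphores in the graphics API, not a Python queue; the sleep times and frame count are invented), a bounded queue makes the CPU thread wait once it is the maximum number of frames ahead:

```python
# Sketch of frames-in-flight: the CPU thread may prepare at most MAX_IN_FLIGHT
# frames before it blocks and waits for the GPU thread to catch up.
# Illustrative only; real engines use graphics-API fences, not a Python Queue.
import threading, time
from queue import Queue

MAX_IN_FLIGHT = 2
FRAMES = 6
pending = Queue(maxsize=MAX_IN_FLIGHT)   # put() blocks when the queue is full

def cpu_work():
    for frame in range(FRAMES):
        time.sleep(0.005)                # pretend I/O, physics, game logic
        pending.put(frame)               # stalls here if the GPU is 2 frames behind
        print(f"CPU prepared frame {frame}")

def gpu_work():
    for _ in range(FRAMES):
        frame = pending.get()
        time.sleep(0.015)                # pretend rendering (slower than the CPU)
        print(f"GPU rendered frame {frame}")

cpu = threading.Thread(target=cpu_work)
gpu = threading.Thread(target=gpu_work)
cpu.start(); gpu.start()
cpu.join(); gpu.join()
```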

Disabling some timing constraints can improve realtime responsiveness, but can cause undesirable side effects. For example, when Vertical Synchronization is enabled, the GPU will only copy a completed bitmap from the render pipe to the frame buffer during the vertical blanking interval (the period after a frame has finished being sent to the monitor and before the next one begins). Disabling Vertical Synchronization allows the GPU to copy the frame from the end of the render pipe to the frame buffer as soon as it is complete, overwriting the contents at the same time that the output driver is reading them. This can cause what is known as "screen tearing".
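
A toy timeline illustrates the difference (made-up frame-ready times, 60 Hz display assumed): with vsync, a finished frame waits for the next vertical blank before it replaces what is on screen; without it, the swap happens immediately and can land mid-scanout, which is what shows up as tearing.

```python
# Toy presentation timeline for a 60 Hz display. Frame-ready times are invented.
import math

REFRESH_MS = 1000 / 60   # ~16.7 ms between vertical blanking intervals at 60 Hz

def swap_time_ms(render_done_ms, vsync=True):
    if not vsync:
        return render_done_ms                                     # swap immediately; may tear
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS    # wait for the next blank

for done in (10.0, 20.0, 25.0):
    print(f"frame ready at {done:5.1f} ms -> vsync swap at {swap_time_ms(done):5.1f} ms, "
          f"no-vsync swap at {swap_time_ms(done, vsync=False):5.1f} ms (possible tear)")
```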

Doing the same on the CPU side may result in game logic executing so quickly that the player is unable to respond due to slow visual feedback. Alternatively, it may result in the game discarding work that it had just completed because the GPU cannot start the associated render work until the CPU has the next batch ready.
 

Thanthos



Wow, just wow.
Thanks for giving me all this information, dude.
I also heard that with a low resolution there's less GPU usage.
Isn't that a bottleneck?
 


It's mostly a case of "A team is only as fast as its slowest player". Give that player more work to do, and the rest of the team sits around twiddling their thumbs; give that player less work to do, and the rest of the team becomes more efficient.
 

Thanthos



WAT?
 

Thanthos



And what about temperatures? Do they affect CPU usage?
 


No, temperature doesn't affect CPU usage under normal circumstances (though if the CPU gets too hot and has to throttle to lower clocks, that can muck things up).
 

Thanthos



Is 65 Celsius fine?
 

Thanthos



I know this question sounds dumb.
What about thermal paste? And how do I know if I applied it? (I don't really remember.)
 

Thanthos



Wait.
So if my CPU usage is around 90-100% and GPU usage is around 60-85%, that's a bottleneck, right? But I'm gaming at 1366x768 resolution.
So if I upgrade the monitor, will I get higher GPU usage while CPU usage stays the same?