Reviews always use low resolutions when testing CPU performance in games. Is there a simple explanation for this?
My assumption was that the CPU would have higher usage at higher resolutions in order to help the GPU.
Lowering the resolution of a game shifts the bottleneck toward the CPU. With fewer pixels to render, the graphics card finishes each frame faster, so the frame rate is no longer limited by the GPU. At a low enough resolution, the frames per second are limited by the CPU's speed instead, which is what makes differences between CPUs visible in a benchmark.
The CPU sets up each frame, handles all the AI and resource allocation, and then passes the parameters to the GPU, which draws the frame. So the CPU does its thing and sends the frame along to the GPU. The larger the frame and the more processing required, the longer the GPU takes to finish. At low resolutions the frames are drawn much, much faster, so the CPU has to do a lot more work setting up frames. Of course, using VSync caps you at 60 FPS, so the CPU strain wouldn't change with resolution, provided your GPU can handle 60 FPS at the larger resolution.
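The bottleneck idea above can be sketched numerically: the achieved frame rate is set by whichever stage is slower per frame. This is a minimal model with made-up frame times purely for illustration, not real benchmark data.

```python
# Minimal bottleneck model: achieved FPS is limited by the slower of the
# CPU's frame-setup time and the GPU's render time. All millisecond
# values below are hypothetical, chosen only to illustrate the idea.

def fps(cpu_ms, gpu_ms, vsync_cap=None):
    """Frames per second given per-frame CPU and GPU times in milliseconds."""
    frame_ms = max(cpu_ms, gpu_ms)  # the slower stage sets the pace
    rate = 1000.0 / frame_ms
    return min(rate, vsync_cap) if vsync_cap is not None else rate

# High resolution: GPU takes 20 ms per frame, so two different CPUs
# (5 ms vs 8 ms per frame) produce the same FPS -- the GPU hides them.
print(fps(cpu_ms=5.0, gpu_ms=20.0))  # 50.0
print(fps(cpu_ms=8.0, gpu_ms=20.0))  # 50.0

# Low resolution: GPU takes only 3 ms, so the CPU difference shows.
print(fps(cpu_ms=5.0, gpu_ms=3.0))   # 200.0
print(fps(cpu_ms=8.0, gpu_ms=3.0))   # 125.0

# With VSync, both systems are capped at 60 FPS regardless of resolution.
print(fps(cpu_ms=5.0, gpu_ms=3.0, vsync_cap=60))  # 60
```

This is why reviewers drop the resolution: it pushes the GPU's per-frame time well below the CPU's, so the measured FPS reflects the CPU alone.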