It really depends on your resolution. The CPU can hold back your FPS, but if your resolution is 1440p or higher, the GPU will still have plenty of work to do keeping up with the added pixels.
Completely true.
Increasing the resolution makes the GPU work harder, shifting the bottleneck toward it. Because of this, I usually have users lower the resolution in games when I'm looking for a bottleneck on client machines. If you lower the resolution (which decreases GPU load) and FPS/stutter does not improve, then it's not the GPU that's holding you back. Usually it's either a potato CPU or not enough RAM for modern games.
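The test above boils down to a simple comparison: if dropping the resolution barely moves your FPS, look past the GPU. A rough sketch, with a made-up 15% threshold and example numbers purely for illustration:

```python
# Hypothetical sketch of the "lower the resolution" bottleneck test.
# The 1.15 gain threshold is an assumption, not a standard value.

def likely_bottleneck(fps_native, fps_low_res, gain_threshold=1.15):
    """Compare average FPS at native resolution vs. a much lower one.

    A big FPS gain at low resolution means the GPU was the limit;
    little or no gain means the limit is elsewhere (CPU, RAM).
    """
    if fps_low_res >= fps_native * gain_threshold:
        return "gpu"
    return "cpu_or_ram"

print(likely_bottleneck(60, 110))  # big jump at low res -> "gpu"
print(likely_bottleneck(60, 63))   # barely moves -> "cpu_or_ram"
```

In practice you'd average FPS over the same in-game scene at both resolutions rather than trusting a single reading.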
I have a 2500K clocked at 5 GHz paired with a reference GTX 1070, and I don't really notice a bottleneck. Playing games like GTA V, DOOM, and H1Z1, my CPU utilization wavers between 60 and 80%. For me it was a great investment and definitely revitalized my gaming rig. I get a steady 100+ FPS in all those games on ultra settings, with an occasional dip into the 80s. Because of those occasional dips, it's apparent the CPU is holding back the GPU, but I don't actually feel it in gameplay. It was a good investment in my opinion for when I decide to do a CPU, motherboard, and memory upgrade.