What is the difference between framerate (fps) and bandwidth (refresh rate)?

Noob333 · Nov 27, 2014
I have heard many things about this, and I've seen both bandwidth and framerate used in reference to video cards, cables, and monitors. Does anyone know exactly what the difference between the two is, and which term is correct for what? One other side question: I've been hearing that fps (frames per second) in games is determined only by your graphics card. Is that correct? I don't really know what to think; I thought that if my CPU was maxing out it could cause my fps to drop, but I'm not really sure. Sorry for bugging you guys so much. I really do try to look on the internet for solutions before coming here, it's just that most of the people here give way better answers and explain things better than any other website I could find.
 
Solution
Well, the frame rate is self-explanatory: FPS = frames per second, i.e., how many frames are being displayed each second.

The refresh rate is not the same as bandwidth. The refresh rate is how many times per second the monitor/display can update the image on the screen, so if a display is capable of 60Hz, the most it can show is 60 FPS.

Bandwidth is the amount of data that can be transferred in a given amount of time. Different parts (cables, ports, memory, and so on) have very different bandwidths.
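To make the bandwidth idea concrete, here is a rough back-of-the-envelope sketch in Python. The 24 bits-per-pixel figure and the example resolutions are just assumptions, and real links also carry blanking intervals and encoding overhead that this ignores:

```python
# Rough uncompressed video bandwidth: pixels per frame * bits per pixel * frames per second.
# Ignores blanking intervals and link encoding overhead, so real requirements are higher.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    bits_per_second = width * height * bits_per_pixel * refresh_hz
    return bits_per_second / 1e9  # gigabits per second

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 120), (2560, 1600, 60)]:
    print(f"{w}x{h} @ {hz}Hz needs roughly {video_bandwidth_gbps(w, h, hz):.2f} Gbit/s")
```

The point is simply that higher resolution or higher refresh rate means more data per second, which is why cable and port bandwidth matters at all.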

What determines the FPS of a game is actually a combination of everything. If the CPU and GPU can push 60 FPS and the monitor can do 60Hz, then you will see 60 FPS. But if the display can do 120Hz while the hardware can only push 60 FPS, the game will still only run at 60 FPS.
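As a toy illustration of that "weakest link" idea (the numbers are made up, and this assumes vsync is on so the display caps what you see):

```python
# Toy model: with vsync on, the frame rate you actually see is capped by
# whichever is lowest: what the CPU/GPU pipeline can render or the refresh rate.

def effective_fps(cpu_fps, gpu_fps, refresh_hz):
    rendered = min(cpu_fps, gpu_fps)   # the slower of the CPU and GPU limits rendering
    return min(rendered, refresh_hz)   # the display can't show more than its refresh rate

print(effective_fps(cpu_fps=90, gpu_fps=60, refresh_hz=120))   # 60: hardware-limited
print(effective_fps(cpu_fps=200, gpu_fps=150, refresh_hz=60))  # 60: display-limited
```

So yes, a maxed-out CPU can absolutely lower your FPS, just like a weak GPU or a low refresh rate can.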

Noob333 · Nov 27, 2014
So Hz/refresh rate should be used for monitors. What about cables and graphics cards? Do you use the term Hz to describe what they can handle, or do you use framerate?
 
For cables, I have never really used a specific term to rate them other than their version. DVI, for example, comes in single link and dual link: dual-link DVI can handle up to 2560x1600 at 60Hz, while single-link tops out around 1920x1200 at 60Hz.
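For a sense of why dual link can drive more, here is a small sketch comparing the active pixel rate of those modes against the commonly cited 165 MHz single-link DVI pixel clock (dual link roughly doubles it). Blanking intervals and link encoding are glossed over, so treat it as an approximation:

```python
# Compare the active pixel rate of a video mode against rough DVI link limits.
# Assumes the commonly cited 165 MHz pixel clock per DVI link and ignores
# blanking intervals, so this is only an approximation.

SINGLE_LINK_MPIX_PER_S = 165.0   # ~165 MHz pixel clock for single-link DVI
DUAL_LINK_MPIX_PER_S = 330.0     # dual link roughly doubles that

def needed_mpix_per_s(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60)]:
    need = needed_mpix_per_s(w, h, hz)
    link = "single link" if need <= SINGLE_LINK_MPIX_PER_S else "dual link"
    print(f"{w}x{h} @ {hz}Hz: ~{need:.0f} Mpixels/s -> needs {link}")
```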

HDMI is rated by its version, so both the port and the cable have to support it. Right now the most common is HDMI 1.4, and HDMI 2.0 is coming out as the next big standard.

Same with DisplayPort: it is rated by its revision as well.