Does a Graphics Card's Output Signal Have an Associated Refresh Rate?

I apologise in advance if my question already has an answer; I spent a reasonable amount of time googling it but could not find any information. I am confused about the relationship between graphics cards and refresh rates, so I will put forth a few questions I have, and hopefully someone can see where my understanding is going wrong and enlighten me :).

Does a graphics card's output signal have a fixed refresh rate, just as the monitor's display refreshes at a fixed rate? I know that a graphics card draws frames, therefore producing a 'frame rate', but is its actual output signal in sync with this? Or does its output signal have a fixed refresh rate, so that frames simply repeat if the next one has not been drawn yet?

When you are playing a video game, it seems like the graphics card simply tries to draw frames as fast as it can. For example, an old game like BioShock runs at over 400 fps on my GTX 760, whereas BioShock Infinite runs between 50 and 100 fps.

But what if you are watching a movie on your computer? Let's say the video file has a frame rate of 24 fps (or 24p). Does the graphics card render the frames at exactly 24 fps, and if so, is the output signal also 24 Hz?
Or does the graphics card just render as many frames as it can, like it does with video games, repeating frames until the next one comes along?

Also, does an LED monitor implement 'pulldown' techniques like TVs do to sync source frames with its own refresh rate?

Many thanks in advance!
 

mike1996

Displays work on a refresh rate, which is how many times per second the screen refreshes and shows a new image. A GPU works in frames per second, and while playing a game it will spit out as many frames as it can, completely untimed with the monitor. That is why we have technologies like G-Sync, FreeSync, and V-Sync.
G-Sync and FreeSync are built into monitors to time the refresh rate to the GPU, while V-Sync locks your frame rate to your refresh rate. If you don't have any of these in your monitor or in the game you are playing, the monitor will sometimes refresh the screen with only part of a new image while running a game. This is referred to as tearing: one part of the image has changed while the rest remains the same, giving it the appearance that the image was torn.
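To make that concrete, here is a minimal sketch of what turning V-Sync on looks like from the application side. It uses GLFW and OpenGL purely as an example (neither is mentioned in this thread); the key call is the swap interval, which makes buffer swaps wait for the monitor's vertical refresh:

```c
/* Minimal V-Sync sketch using GLFW + OpenGL (illustrative only).
 * Build roughly as: cc vsync_demo.c -lglfw -lGL */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *window = glfwCreateWindow(800, 600, "V-Sync demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(window);

    /* Swap interval 1: each buffer swap waits for one vertical blank,
     * so the presented frame rate is capped at the monitor's refresh
     * rate. Interval 0 presents frames as fast as they are drawn,
     * which is where tearing can show up. */
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);  /* render the frame here */
        glfwSwapBuffers(window);       /* swap synchronised to the refresh */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```

With the interval set to 0 instead, the loop runs as fast as the GPU allows, which is the 400+ fps behaviour described in the original question.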
When running a movie you are just redrawing predetermined frames from your hard drive. If the movie was recorded at 24 fps, then the GPU can only put out that many; it can't make up new frames.
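For the 24 fps movie case, here is a tiny sketch (hypothetical numbers, not something from this thread) of how a player ends up repeating frames on a fixed 60 Hz output: each refresh simply shows the most recent movie frame that is due, which gives the familiar 3:2 cadence of frames held for two or three refreshes.

```c
#include <stdio.h>

int main(void)
{
    const int refresh_hz = 60;  /* fixed display refresh rate     */
    const int video_fps  = 24;  /* source frame rate of the movie */

    /* For each display refresh, pick the movie frame whose presentation
     * time has most recently passed; each frame is shown 2 or 3 times. */
    for (int refresh = 0; refresh < 10; refresh++) {
        int source_frame = refresh * video_fps / refresh_hz;
        printf("refresh %2d -> movie frame %d\n", refresh, source_frame);
    }
    return 0;
}
```

Running it shows movie frame 0 held for three refreshes, frame 1 for two, frame 2 for three, and so on.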
 
Thanks for the answers, guys! Still a little unsure though: it sounds like rolli59 is saying a graphics card's output signal does have an associated refresh rate, whilst mike1996 is saying it doesn't and the card just outputs frames as it draws them. A bit more clarification would be much appreciated!

Cheers :)
 

mike1996


On their own, they have no correlation. Software like V-Sync (vertical sync) exists to fix this: V-Sync tells the GPU to only output frames when the monitor is refreshing. Some "gaming" monitors have a piece of technology called G-Sync, which instead tells the monitor to only refresh when it gets a frame from the GPU.
(G-Sync is proprietary to Nvidia and can only be used with Nvidia GPUs.)
 
Solution
They output frames as they draw them, and the monitor accepts them when it is ready. Sometimes this causes an issue known as screen tearing; the counter to screen tearing is V-Sync, which can introduce some lag.

The newer standard in very recent cards and monitors is G-Sync, where the card outputs frames as it draws them and the monitor runs at a variable refresh rate. Monitors most typically run at 60 Hz, 120 Hz, or 144 Hz. Mike1996 is correct.
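As a quick back-of-the-envelope addition (my numbers, not from the thread): each of those refresh rates corresponds to a fixed time window per refresh, which is what the GPU has to finish a frame within to hold the full rate under plain V-Sync.

```c
#include <stdio.h>

int main(void)
{
    /* Time between refreshes for the common rates mentioned above. */
    const double rates_hz[] = { 60.0, 120.0, 144.0 };

    for (int i = 0; i < 3; i++)
        printf("%5.0f Hz -> %5.2f ms per refresh\n",
               rates_hz[i], 1000.0 / rates_hz[i]);
    return 0;
}
```

That works out to roughly 16.7 ms at 60 Hz, 8.3 ms at 120 Hz, and 6.9 ms at 144 Hz.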