Educate me about refresh rate?

Taruntj5

Honorable
Dec 18, 2013
47
0
10,540
So, I have a Sony 3D display which I use as a monitor, and I think it has a refresh rate of 240Hz. Now, I know HDMI is only capable of 60Hz at 1080p, right? So if I get more than 60 frames per second in games, would it cause tearing?
BTW, in Windows it only shows 60Hz.
One thing I've noticed is that sometimes when I turn v-sync on in my games, my fps doesn't lock at 60; it goes over 60 quite often. Is that because my monitor has a higher refresh rate? If I had a 60Hz panel, would it lock at 60?
And one last thing: can I play games at 720p at 120Hz over HDMI?

Thanks!
 
Solution
Monitor refresh rate and frames per second from the GPU are two different things.

The GPU will churn out as many frames as it can without vsync. Enabling vsync tells the GPU to try to limit the framerate to 60fps even if the card can produce more than that (the cap isn't always strict, which is why you can still see more than 60).

The framerate the GPU produces doesn't depend on the refresh rate of the monitor, which is why an fps counter can report more than 60fps even on a 60Hz monitor.

And a 60Hz monitor will only ever display 60 frames per second: whichever frame is ready at each refresh.
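
A hypothetical simulation makes the capping behaviour concrete (the 8 ms render time, the function name, and the 60Hz figure are all made-up numbers for illustration):

```python
# Toy model: a GPU that renders a frame in 8 ms, with and without vsync.
RENDER_TIME_MS = 8.0                  # hypothetical time to render one frame
REFRESH_INTERVAL_MS = 1000.0 / 60.0   # a 60Hz display refreshes every ~16.67 ms

def frames_in_one_second(vsync: bool) -> int:
    """Count frames the GPU completes in one simulated second."""
    t = 0.0
    frames = 0
    while t < 1000.0:
        t += RENDER_TIME_MS  # GPU finishes rendering a frame
        if vsync:
            # With vsync the GPU stalls until the next refresh boundary,
            # so it can never complete more than one frame per refresh.
            next_refresh = (t // REFRESH_INTERVAL_MS + 1) * REFRESH_INTERVAL_MS
            t = next_refresh
        frames += 1
    return frames

print(frames_in_one_second(vsync=False))  # uncapped: 125 fps
print(frames_in_one_second(vsync=True))   # capped to ~60 fps
```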

For 120Hz at 720p, using dual-link DVI or DisplayPort is a lot more reliable.
Hi there,

Refresh rate is the rate (in refreshes per second, Hz) at which images are drawn to a display. Data is read from a frame buffer on the signal source in real time, regardless of whether a new image has been written to that buffer. Since the display reads from the frame buffer at a constant rate, the read rate will often differ from the rate at which the GPU writes to it. If the render pipeline does not write to the frame buffer at exactly the rate the display pipeline reads from it, one of two things happens: the display pipeline reads ahead of the render pipeline and picks up data from the previous frame, or the render pipeline writes ahead of the display pipeline and overwrites data that hasn't been displayed yet. This is the cause of image tearing in video games when the frame rate is not some factor of the refresh rate.

When V-Sync is enabled in a game (note that the v-sync option in rendering applications is distinct from the v-sync signal used to control the display), the game only updates the frame buffer during the vertical blanking period, a short interval that occurs after the completion of every frame. The existence of this interval is mostly historical and only remains for compatibility, but it exists nonetheless, so I'll not go into it deeply. With V-Sync enabled, the frame in the frame buffer is guaranteed to be complete for the duration of the display interval. This eliminates screen tearing but causes the render pipeline to stall, reducing peak frame rate and introducing some minor input lag.
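
The single-buffer tearing mechanism can be sketched with a toy model (the buffer size and function names are invented, and a real scan-out runs pixel by pixel rather than over Python lists):

```python
# Toy single frame buffer: each "row" stores the number of the frame
# that last wrote it.
ROWS = 10
buffer = [0] * ROWS  # frame 0 is fully written

def scan_out():
    """Display pipeline: read the whole buffer top to bottom."""
    return list(buffer)

def render(frame_num, up_to_row):
    """Render pipeline: overwrite rows [0, up_to_row) with a new frame."""
    for r in range(up_to_row):
        buffer[r] = frame_num

# Without v-sync the GPU may write mid-scan: here frame 1 has replaced
# only the top 4 rows when the display reads the buffer, so the image
# on screen mixes two frames.
render(frame_num=1, up_to_row=4)
torn_image = scan_out()
print(torn_image)  # [1, 1, 1, 1, 0, 0, 0, 0, 0, 0] -- a visible tear line

# With v-sync, writes land only between scans, so each scan sees one frame.
render(frame_num=1, up_to_row=ROWS)
print(scan_out())  # [1, 1, 1, 1, 1, 1, 1, 1, 1, 1] -- complete, no tear
```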

When a video game reports a framerate, it tallies it by counting the number of successful frame-update calls made over a one-second interval. This is all done in software, without regard to the rate at which those same frames are sent to the display. As I mentioned above, the render pipeline will stall if it gets ahead of the display pipeline, effectively limiting the GPU to one complete frame per refresh interval, or one frame every 16.67 milliseconds at a 60Hz refresh rate.
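
A minimal sketch of such a software fps tally, assuming a monotonic clock (the class name and structure are hypothetical, not any particular engine's API):

```python
import time

class FpsCounter:
    """Tally successful frame-update calls over one-second windows."""
    def __init__(self, now=time.monotonic):
        self.now = now
        self.window_start = now()
        self.count = 0
        self.fps = 0

    def frame_done(self):
        """Call once per completed frame update; returns the last full tally."""
        self.count += 1
        if self.now() - self.window_start >= 1.0:
            self.fps = self.count
            self.count = 0
            self.window_start = self.now()
        return self.fps

# Simulated clock: 128 frames spread over exactly one second
# (a step of 1/128 s is exact in binary floating point).
t = [0.0]
counter = FpsCounter(now=lambda: t[0])
for _ in range(128):
    t[0] += 1.0 / 128.0
    counter.frame_done()
print(counter.fps)  # 128 -- counted in software, regardless of the display
```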

You are correct in noting that, in general, an HDMI link only provides enough bandwidth to run a 1920x1080 display at 60Hz. HDMI 1.4 supports higher bandwidth, but that requires a compatible TV and GPU. It's a bit messy.
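
As a rough sanity check on the bandwidth point, here are the standard CEA-861 raster totals plugged into the pixel-clock formula (treat the comparison as a sketch, not a spec citation):

```python
# Back-of-the-envelope HDMI pixel-clock check using CEA-861 timings.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 1080p60: 1920x1080 active inside a 2200x1125 total raster
clk_1080p60 = pixel_clock_mhz(2200, 1125, 60)   # 148.5 MHz
# 720p120: 1280x720 active inside a 1650x750 total raster
clk_720p120 = pixel_clock_mhz(1650, 750, 120)   # 148.5 MHz

HDMI_1_2_MAX_CLK = 165.0  # MHz, original single-link HDMI/DVI TMDS limit

print(clk_1080p60, clk_1080p60 <= HDMI_1_2_MAX_CLK)  # 148.5 True
print(clk_720p120, clk_720p120 <= HDMI_1_2_MAX_CLK)  # 148.5 True
```

Interestingly, 720p120 fits the original 165 MHz single-link clock budget on paper; the practical problem is that pre-HDMI-1.4 TVs and GPUs generally don't advertise or accept the mode, which is part of why dual-link DVI or DisplayPort tends to be more reliable for it.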

The 240Hz that your TV is marketed with refers to an internal refresh. Many displays do this to keep the image sharp and reduce ghosting; some even interpolate to correct for motion blur, compression artifacts, and distortion. On plasma displays the individual pixels lose their luminosity very quickly, so each one must be refreshed several times before a single frame has finished being displayed. My plasma display has an internal refresh rate of 600Hz, or 10 full refreshes per frame on a 60Hz input signal.
 

Taruntj5

Honorable
Dec 18, 2013
47
0
10,540


Thanks for reply.
But my questions are still unanswered.
 