How to determine the fps a monitor can reach?

gamerxavier

I briefly read a forum post a while back about monitors and the theoretical FPS they can reach. It included details like how it depends on what you're running and things like that.

With that in mind, the forum never gave much detail as to how to actually figure out the FPS.

So, based on my system and my monitor, an LG Flatron EW224T-PN: what would my FPS max out at on that monitor, and what if I upgraded my GPU?

I almost always use vertical sync and keep it at 60, but I'm curious to see if I can personally see a difference between 60 FPS and 100 or so. It likely won't matter much if it's 60 to 75 at most, but it wouldn't be terrible to try. Nonetheless, if someone can use my system and monitor to explain how to determine the FPS a monitor can reach, that'd be really awesome; it's something I've wondered about for quite a while.
 
Conventional LCD monitors refresh at 60 Hz. That is the same as 60 fps.

By running vSync you are limiting the graphics card's output to what the monitor can actually display. This used to be important so the monitor was not damaged, and that's still true if you are using some TVs.
If you disable vSync you won't get more fps actually on the screen.
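To make that concrete, here is a rough sketch (my own illustration, not anything from this thread) of what a frame cap effectively does: the render loop may finish a frame early, but nothing new is presented until the next 60 Hz refresh slot, so the screen never shows more than 60 distinct frames per second.

import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ  # about 16.7 ms per refresh at 60 Hz

def run_capped(num_frames: int = 120) -> None:
    """Toy render loop that never presents faster than the refresh rate."""
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        # render_frame() would go here; the GPU might finish in 5 ms,
        # but the result is held until the next refresh slot.
        next_deadline += FRAME_BUDGET
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)

run_capped()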

gamerxavier

Odd, the forum said otherwise about the 60 Hz and 120 Hz stuff. It made sense, but I guess that forum just confused me even more. Nonetheless, it was a tad old, so I can't say much.

Edit: the forum I read spoke about how 2 ms and 5 ms etc. affected the FPS the monitor can reach.
 
The 2 ms or 5 ms figure is the response time:
the time the monitor [theoretically] takes to change from one frame to the next.

Obviously there are 1000 milliseconds in a second.
A 60 Hz monitor is capped at 60 fps it can actually display, but a faster response time means more of each frame's ~16.7 ms is spent displaying the actual frame and less is spent transitioning between frames.
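As a rough sketch of that arithmetic (my own numbers, purely illustrative, and assuming the quoted response time is accurate):

def frame_time_ms(refresh_hz: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz  # 1000 ms in a second

for refresh_hz in (60, 120):
    frame = frame_time_ms(refresh_hz)
    for response_ms in (2, 5, 8):
        share = response_ms / frame * 100
        print(f"{refresh_hz} Hz: {frame:.1f} ms per frame; a {response_ms} ms panel "
              f"spends roughly {share:.0f}% of that transitioning")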

In real life I can't see any difference between 8 ms and lower. Others may have an awareness of it, but I doubt it.
And then of course the response time they quote is measured in different ways by different manufacturers, and the real figure is usually a lot higher than the theoretical minimum.
Ideally, LOOK at an actual screen and see if you like it.

Also keep in mind that movies, DVDs and Blu-rays play at 24 fps, and TV is probably similar [but varies in different countries with different technologies].

michaeljhuman

Movies do play at 24 fps, but (it used to be that) TVs did not, which led to a whole set of issues related to trying to display 24 fps at interlaced or progressive rates (which were around 60 and 30, if memory serves).

BUT there are some TVs that can display at a handy multiple of 24 fps, which lets movies play back more naturally.
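A quick sketch of why that is (my own illustration, not from this thread): 24 fps does not divide evenly into 60 Hz, so something like 3:2 pulldown is needed, whereas a 120 Hz panel can simply show each film frame five times.

FILM_FPS = 24

for display_hz in (60, 120):
    ratio = display_hz / FILM_FPS
    if ratio.is_integer():
        note = f"each film frame shown {int(ratio)} times -- even cadence"
    else:
        note = f"{ratio:g} refreshes per film frame -- uneven pulldown needed"
    print(f"{display_hz} Hz display: {note}")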

It's an interesting topic, if you like audio/video anyway :)

My question is whether 120 Hz monitors smooth out gameplay even if you can't manage at least 120 fps. But I don't want to hijack your topic, sorry :)


Different TV technologies like PAL or NTSC...

Once I get to 40 fps it's very hard to detect any difference IMO, so if I'm getting 40 FPS minimum I think I have the image details about right.

dingo07

Well, the first part of the equation is: it depends on whether the source has been recorded on film (CinemaScope, Panavision, IMAX) or video (iDevice, digital video camera, etc.) or digitally rendered at a given frame size and FPS (3ds Max, Avid, ILM, etc.).
Then the processing ability of the player and the video hardware comes into play in making it seamless. If the bitrate is higher than either can handle, you get all kinds of artifacts. You see this regularly with HD streamed from digital cable and satellite services, though satellite is a bit of a separate discussion because the signal has been compressed to go through the air a lot more than cable has been compressed to go through the wire.
Here is where the speed, or response time, of the display comes in. You're much less likely to see artifacts from a display that is 5 ms compared to 12 ms.
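For a rough idea of what those source types usually mean in frame-rate terms, here is a small illustrative list (nominal figures from memory, not from this thread; real material can vary):

# Nominal frame rates for common source types; real material can vary.
TYPICAL_SOURCE_FPS = {
    "film / Blu-ray movie": 24,
    "PAL broadcast video": 25,
    "NTSC broadcast video (nominal)": 30,
    "digitally rendered game footage": 60,
}

for source, fps in TYPICAL_SOURCE_FPS.items():
    print(f"{source}: about {fps} fps")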