Question about FPS and benchmarks

September 9, 2010 5:51:28 PM

This may be a dumb question, so be gentle if it is...

In the benchmarks, they provide an FPS number for the various games they test.

LCD computer monitors typically refresh at 60 or 120 Hz.

If you have a video card that runs a game at 89 frames per second and a monitor that refreshes 60 times per second, are some of the frames from the GPU lost?

If, on the other hand, the GPU can only manage 40 FPS, I assume there will be times when the image doesn't change for a refresh cycle?

So, if your monitor is 60 Hz, is there any advantage to being able to display a game at 89 FPS (or whatever number)?

Thanks.
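
To make the scenario concrete, here is a minimal Python sketch of the idealized case the question describes. It assumes the GPU finishes frames at a perfectly steady rate and the monitor always grabs the most recently completed frame; real timing is messier, so treat the counts as illustrative:

def displayed_frames(gpu_fps, monitor_hz, duration_s=1.0):
    # Frame index shown at each refresh: the latest frame finished by that time.
    return [int(refresh / monitor_hz * gpu_fps)
            for refresh in range(int(monitor_hz * duration_s))]

for fps in (89, 40):
    shown = displayed_frames(fps, 60)
    rendered = int(fps * 1.0)
    never_shown = rendered - len(set(shown))   # rendered but never displayed
    repeats = len(shown) - len(set(shown))     # refreshes that reuse a stale frame
    print(f"{fps} FPS on 60 Hz: {never_shown} frames lost, {repeats} repeated refreshes")

At 89 FPS it reports 29 frames that are never displayed; at 40 FPS no frames are lost, but 20 of the 60 refreshes repeat the previous image, matching the intuition above.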

September 9, 2010 6:05:55 PM

I have no clue, really, but since I see no replies I'm not going to keep my mouth shut. I would assume it drops the extra frames. (BTW, monitors also run at 70 and 75 Hz.) I once ran Medal of Honor: Airborne at 3000+ FPS (????), no clue how. You could also just turn up AA, AF, AS, etc. and run at a lower frame rate.

Best solution

September 9, 2010 6:13:23 PM

You're pretty much right, the extra frames are "lost". This is basically why Vsync was invented: it scales the output back to match your monitor's refresh rate. The catch is that with Vsync on, the frame rate can sometimes drop, say to 55 FPS, even though with Vsync off the same scene might average 90 FPS. Generally you won't notice that, but it's a bit odd.

There are a couple of downsides to outputting way more FPS than you can display. The first is unnecessary GPU usage: more power draw and heat for frames nobody sees. The second, and far more noticeable, is screen tearing. The GPU swaps in a new frame partway through the monitor's refresh, so one part of the screen shows one frame and the rest shows another, with a visible line/tear between them.

Ideally, you run at your monitor's refresh rate, generally 60 Hz, but like atotalnoob said there are also 70, 75, and even 120 Hz (for 3D) monitors.
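
If you don't want Vsync but still want to avoid burning GPU time on frames nobody sees, a frame limiter does roughly the following. This is a minimal Python sketch of the pacing idea only; real Vsync is handled by the driver and display, not by sleeping:

import time

REFRESH_HZ = 60
FRAME_TIME = 1.0 / REFRESH_HZ

def render_frame():
    pass  # stand-in for the game's actual rendering work

deadline = time.monotonic()
for _ in range(120):                      # two seconds' worth of frames
    render_frame()
    deadline += FRAME_TIME
    slack = deadline - time.monotonic()
    if slack > 0:
        time.sleep(slack)                 # idle instead of rendering unseen frames
    else:
        deadline = time.monotonic()       # fell behind; resync so we don't rush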
September 9, 2010 6:16:50 PM

60 Hz means the monitor is capped at receiving 60 frames per second, even though the GPU itself can render faster, so yeah. That said, above roughly 45 FPS the difference is hard to notice.

Note that some "120 Hz" TVs aren't really 120 Hz inputs; they simply display every frame of a 60 Hz signal twice.

September 9, 2010 6:23:36 PM

Thanks for the feedback. Good answers.
September 9, 2010 6:23:45 PM

Best answer selected by 23mike.
September 9, 2010 6:32:54 PM

This topic has been closed by Mousemonkey