Frames per second as a measure of performance

There have been a number of comments in the forums lately stating that frames/second is a flawed and misleading measure of performance. Up until the past year, I had always understood that framerate correlates perfectly with smoothness, but apparently that's not the case. THG's articles on micro-stuttering with multi-GPU setups demonstrated how you can have genuinely high framerates yet not have smooth performance.

I understand that 30fps doesn't necessarily mean each frame takes 1/30th of a second to render; it's just how long frames take on average, with some taking longer and others being faster. Is that all there is to the argument, or is there more to it than that? I'm interested in hearing from people with knowledge on the subject, especially if you have a knack for explaining things in an easy-to-understand way :) So:

1. How is frames/second flawed as a measure of smoothness?

2. What alternatives exist?

3. Is it possible to quantify the results of those alternatives and plot them on a chart in the same way framerates are?

Thanks for reading, and any insight you can provide will be much appreciated.
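PS: a rough illustration of what I mean about the average hiding slow frames (the frame times here are made up, not measurements from anything):

    # Two made-up frame-time logs (milliseconds per frame).
    # Both work out to roughly the same average fps, but one has big spikes.
    steady = [33.3] * 30               # every frame ~33 ms -> a steady ~30fps
    spiky = [20.0] * 25 + [100.0] * 5  # mostly fast frames plus five 100 ms hitches

    for name, times in (("steady", steady), ("spiky", spiky)):
        avg_fps = 1000.0 * len(times) / sum(times)
        print(f"{name}: average {avg_fps:.1f} fps, worst frame {max(times):.0f} ms")

Both logs come out at about 30fps on average, but the second one would visibly hitch five times.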
 

Ravyu (Honorable, Mar 8, 2012)
For one, fps is the easiest way to demonstrate the power of a GPU to less technically minded people. You can't go around saying, "Oh, your setup will have 20% stutter if you go to that level of settings." Sure, it isn't the most accurate measure, but it's surely the most widely accepted one. If what you said is true, then Nvidia and AMD would show how well a card performs in terms of stutter and smoothness right on the first page, yet they only show frame rates to prove how great their GPUs are. Only if you dig deeper into their sites will you find all the other stuff.

Although I haven't really answered any of your questions, I thought it would be OK to post something like this here.
 

Sumukh_Bhagat (Honorable, Nov 11, 2012)
My friend has a 6870. He plays games at the highest possible settings and gets around 25fps.
But when you see him playing without Fraps running, you'd never be able to guess whether it is 120fps or 20fps. It's so smooth I can't tell you, like the movies we watch or a slow-mo video (TimeWarp).

In the end I'd say it's not about fps, it's about smoothness.
 
Well that's what I'm talking about - framerate is meant to be a measure of smoothness. If it doesn't work, is there an alternative that gives a more accurate measure that could be quantified and used to compare hardware?
 

Sumukh_Bhagat (Honorable, Nov 11, 2012)
I don't know, but there should be something.

Like when I used to play games with my old card, they ran at a low fps but were smooth.
I saw a video in which even a GTX 690 SLI setup lagged a little??!! What's the point of paying that much if you still lag!!! :na:
I'm glad you raised this matter here. Thanks, you just gave me another way to look at GPUs. :lol:
 

Sumukh_Bhagat (Honorable, Nov 11, 2012)
When I first told people about this smoothness thing, they laughed at me. But they should know that if a game runs in the 20s but smoothly, even 60fps won't necessarily beat it.

My old card plays Hitman: Absolution at 8fps (you can laugh) but it's still smooth; Hitman games are pretty slow-paced anyway. I played a Dirt Showdown split-screen match and was getting 10fps, but very smoothly.
Now my new card plays games above 60fps, and smoothly.
 

Scott_D_Bowen (Honorable, Nov 28, 2012)
Any of the links here will explain why FPS is a flawed metric:
- http://techreport.com/graphics/
- You want to know the 99th-percentile frame time (a heavily weighted average, in a way).
- You'll want at least one weighted-average figure for the 2nd standard deviation.
- You'll want to know how many milliseconds in a given benchmark are spent rendering frames that took over 25 ms to draw (40fps is a good minimum, which 'usually' equates to a ~70fps average in today's titles); there's a rough sketch of computing these below.
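Something like this is roughly how those figures fall out of a frame-time log (the frame times and the 25 ms threshold below are placeholders I've made up, not benchmark data):

    # Sketch: per-frame render times in milliseconds from some benchmark run.
    frame_times_ms = [14.2, 15.1, 13.8, 16.0, 41.5, 14.9, 15.3, 38.2, 14.7, 15.0]

    def percentile(values, pct):
        """Simple nearest-rank percentile of a list of values."""
        ordered = sorted(values)
        rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
        return ordered[rank]

    threshold_ms = 25.0  # ~40 fps floor
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    p99_ms = percentile(frame_times_ms, 99)
    slow_ms = sum(t for t in frame_times_ms if t > threshold_ms)

    print(f"average: {avg_fps:.1f} fps")
    print(f"99th-percentile frame time: {p99_ms:.1f} ms ({1000.0 / p99_ms:.1f} fps equivalent)")
    print(f"time spent on frames slower than {threshold_ms:.0f} ms: {slow_ms:.1f} ms")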

I think this page is a good example of what I mean, even if it is Skyrim:
- http://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/9

In Skyrim (for example), the GTX 660 Ti is +46.8% faster than the Radeon HD 7950 by this more heavily weighted measure:
- GTX 660 Ti: 17.3 ms 99th-percentile frame time, i.e. >= 57.80 fps for 99% of frames drawn.
- HD 7950: 25.4 ms 99th-percentile frame time, i.e. >= 39.37 fps for 99% of frames drawn.
... yet it's only ~7% faster on a plain average taken over a huge span of time!
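The conversion is just fps = 1000 / frame time in ms; a quick sanity check of the figures above:

    # 99th-percentile frame times quoted from the linked Tech Report page.
    gtx_660_ti_ms = 17.3
    hd_7950_ms = 25.4

    print(f"GTX 660 Ti: {1000.0 / gtx_660_ti_ms:.2f} fps or better for 99% of frames")
    print(f"HD 7950:    {1000.0 / hd_7950_ms:.2f} fps or better for 99% of frames")
    print(f"difference: {(hd_7950_ms / gtx_660_ti_ms - 1) * 100:.1f}% in the 660 Ti's favour")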

EDIT: A GeForce 9400 GT (256MB?) compared to a GeForce GTX 660 (non-Ti):
- http://www.hwcompare.com/13431/geforce-9400-gt-256mb-vs-geforce-gtx-660/
- The GTX 660 is 10 to 18 times the more powerful card.

Personally, I believe that having a higher pixel rate, while not skimping on texel rate, helps with smoothness.
- During heavy action, a high texel rate helps keep things playable.
- The ratio of texel rate to pixel rate depends on the resolution and even the aspect ratio (16:9 vs 16:10 would call for a +/-9% difference between the two; quick pixel counts below).
- It is thus 'impossible' to make the perfect GPU, so they will generally target a specific resolution: 1600x900, 1680x1050, 1920x1080 or 1920x1200.
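To put rough numbers on the resolution point, just the raw pixel counts per frame:

    # Back-of-the-envelope: pixels per frame at the common target resolutions.
    resolutions = [(1600, 900), (1680, 1050), (1920, 1080), (1920, 1200)]
    base = 1920 * 1080

    for w, h in resolutions:
        pixels = w * h
        diff = (pixels - base) / base * 100
        print(f"{w}x{h}: {pixels:,} pixels ({diff:+.1f}% vs 1920x1080)")

1920x1200 has about 11% more pixels per frame than 1920x1080 (put the other way, 1080 lines is about 10% fewer), which is where that aspect-ratio figure comes from.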

If you read all the way back to the NVIDIA RIVA TNT/TNT2/Vanta articles you'll see good discussions of it.
The original Radeon (what we now call the 7000 LE) had driver issues until very late in the game, so it took ATI until the Radeon 9000 Pro to really get their game on... but when they did, they really did. (The first card not to be slowed down much when anti-aliasing was used, at an awesome price to boot.)


 
That looks like a best answer to me :) Thanks Scott - really appreciate you taking the time to explain this. The Tech Report benchmarking is really interesting stuff - hopefully we'll start seeing everyone using this soon. I actually owned a Radeon 9700 and it rendered black triangles over everything and put a green glow on some textures... but it was damned fast :)
 

Scott_D_Bowen (Honorable, Nov 28, 2012)
Kudos all [:)]

Windows 8 has a new WDDM (Windows Display Driver Model) version which benefits AMD/ATI slightly more than it does NVIDIA.
- NVIDIA's kernel-mode code was already 'too good' to begin with, though.
- ... maybe so much so that it has security problems due to 'extreme' optimization?

They should both fork their drivers for WDDM v1.1 and above into 'business/secure' and 'gaming/extreme' IMHO.

I still think the NVIDIA cards have a slight advantage in 'smoothness' over AMD/ATI (taking consistency of frame-render time across the current frame, the previous four frames and the next four frames as a working definition).
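One way you could put a number on that definition (my own rough sketch, not a metric either vendor publishes) is to measure how far each frame strays from the frames around it:

    # Compare each frame's render time to the mean of a window covering the
    # previous four frames, itself, and the next four frames.
    # The frame times below are made-up placeholders.
    frame_times_ms = [16.1, 16.4, 15.9, 16.2, 33.0, 16.0, 16.3, 15.8, 16.1, 16.2, 16.0]

    def consistency(times, radius=4):
        """Average absolute deviation (ms) of each frame from its local window mean."""
        deviations = []
        for i, t in enumerate(times):
            window = times[max(0, i - radius): i + radius + 1]
            mean = sum(window) / len(window)
            deviations.append(abs(t - mean))
        return sum(deviations) / len(deviations)

    print(f"average deviation from local mean: {consistency(frame_times_ms):.2f} ms")

The lower that number, the closer each frame stays to its neighbours, even when two cards post identical average framerates.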

The Radeon HD 8800s with new drivers on Win8 x64 may totally change the game though!

More here: http://msdn.microsoft.com/en-us/library/windows/hardware/br259098.aspx

... and thanks to sam_p_lay for the discussion & raising the point.

Looks like AMD/ATI haven't been asleep at the wheel and Win7 customers may benefit!
- http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times
- Maybe they read this very thread?