There have been a number of comments in the forums lately stating that frames/second is a flawed and misleading measure of performance. Until the past year, I had always assumed that framerate correlates directly with smoothness, but apparently that's not the case. THG's articles on micro-stuttering with multi-GPU setups demonstrated how you can have genuinely high framerates and still not have smooth performance.
I understand that 30fps doesn't necessarily mean each frame takes 1/30th of a second to render; it's just the average, with some frames taking longer and others less. Is that all there is to the argument, or is there more to it? I'm interested in hearing from people with knowledge on the subject, especially if you have a knack for explaining things in an easy-to-understand way. So:
1. How is frames/second flawed as a measure of smoothness?
2. What alternatives exist?
3. Is it possible to quantify the results of those alternatives and plot them on a chart in the same way framerates are?
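To illustrate what I mean about averages, here's a quick sketch (with made-up frame times) showing how two runs can report the exact same average fps while one of them has much worse worst-case frames:

```python
def fps_from_frame_times(frame_times_ms):
    """Average fps over a run, computed from individual frame times in ms."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Run A: perfectly even ~33.3 ms frames -> a smooth 30 fps
even = [100 / 3.0] * 30

# Run B: alternating fast/slow frames (a micro-stutter-like pattern)
# with the same total time, so the same average fps
stutter = [10.0, 170 / 3.0] * 15

print(fps_from_frame_times(even))     # 30 fps
print(fps_from_frame_times(stutter))  # also 30 fps
print(max(even), max(stutter))        # worst frame: ~33.3 ms vs ~56.7 ms
```

Both runs come out to "30 fps", but the second one spends every other frame at nearly 57 ms, which you'd presumably feel as stutter even though the average looks fine.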
Thanks for reading, and any insight you can provide will be much appreciated.