NVIDIA vs. 3Dfx - TNT vs. the Voodoos

3D Benchmarking And What Is Really Important For Us

I had a meeting with an old friend from the graphics business today. We spoke about the good old days when my website was still very young and I came up with the 'Monster Truck Madness 3D Benchmark' (Nov. 3, 1996), as some of my long-term readers may still remember. He said that this was the first time a game was used for benchmarking, and that when he went to magazines and told them to use it, the editors asked him, "What? You want to use a game for benchmarking?"

A lot has changed since then. Many games now include some way of benchmarking; Quake was, and now Quake2 is, the best known of them. All those game benchmarks have one problem, however: they only tell us the average frame rate over a certain number of frames or a certain length of demo playback. Since the release of Voodoo2 we have gotten pretty used to frame rate numbers from 50 up to well over 100 fps.

However, everyone knows that the eye needs only about 25 fps to perceive a scene as smooth motion; movies are played at 24 fps and everyone seems to be happy with that. The need for frame rates way above 25 fps was justified with the theory that in game play the frame rate could drop to 10 fps for just one second, hardly changing the average frame rate reported by a benchmark, but making the game unplayable for that moment. This is why frame rates of 35 fps were rated as not good enough, and why a card that does 70 fps instead of 35 fps is regarded as better.

If we think about it for a second, we can see that this way of rating 3D cards is not quite correct. Couldn't a card that scores an average of 70 fps be running at only 15 fps in some scenes, making up for it by running at some crazy 125 fps in others? Couldn't a card that does 35 fps be varying only between 30 and 40 fps, and thus actually be better than the 70 fps product? The fact of the matter is that the wonderfully reported average frame rate on its own tells us much less than we like to think.
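To make that concrete, here is a tiny sketch with made-up per-frame numbers (illustrative only, not measurements from any card) showing how two cards can differ wildly while the averages tell the familiar 70-vs-35 story:

```python
# Hypothetical per-frame fps traces (invented numbers, purely for illustration).
card_a = [125, 125, 125, 15, 15, 15]   # huge swings: crazy highs, nasty dips
card_b = [30, 35, 40, 30, 35, 40]      # steady, never drops below 30 fps

for name, trace in (("Card A", card_a), ("Card B", card_b)):
    average = sum(trace) / len(trace)
    print(f"{name}: average {average:.0f} fps, minimum {min(trace)} fps")
```

Card A wins the average with 70 fps against 35 fps, yet it is Card B that never drops into unplayable territory.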

What we really need to know is the minimum frame rate in a game, and if possible we should see how the frame rate changes during game play. Luckily I have found a partial solution to that. I now have software that saves the frame rate for each frame of a demo, so that it can be analyzed later. For the first time I was able to see a graph that shows how the frame rate changes from frame to frame.
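The exact format my logging software writes is not important here; assuming a plain text file with one fps value per line (an assumption for illustration, not the tool's actual format), a few lines of Python are enough to pull out the numbers that actually matter:

```python
# Minimal sketch: read a per-frame fps log (one value per line -- assumed format)
# and report minimum, average, and how often the rate dips below a playable level.
def analyze(path, playable=25.0):
    with open(path) as f:
        fps = [float(line) for line in f if line.strip()]
    dips = [value for value in fps if value < playable]
    print(f"frames analyzed: {len(fps)}")
    print(f"average: {sum(fps) / len(fps):.1f} fps")
    print(f"minimum: {min(fps):.1f} fps")
    print(f"frames below {playable} fps: {len(dips)} "
          f"({100 * len(dips) / len(fps):.1f}%)")

# "demo_framerates.txt" is a hypothetical file name for such a log.
analyze("demo_framerates.txt")
```

The 25 fps threshold is simply the playability figure discussed above; the same data can of course be plotted frame by frame to see exactly where the dips occur.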

Unfortunately I can only do this with Direct3D games so far, but I hope that I will be able to do it for OpenGL games soon too.
