When playing games on PC I like to get the most out of my rig: if it can handle high settings, it seems a shame (and a waste of the money I've spent on it) to play at medium. There is, however, one thing I care about far more than detail levels or AA: consistent FPS. That makes it tricky to settle on a particular combination of video settings. My gaming sessions sometimes turn into an OCD-fest where I keep fiddling with the settings because of unexpected FPS drops. Other times I do the opposite and settle for lower details even though my rig could probably handle more, just so I can sit back and enjoy the game.
Let's take a look at a concrete situation: configuring Arkham City. To keep it simple, say I'm happy with a constant 30 FPS; I can use Dxtory to cap the framerate there. After some fiddling I try out a specific set of video settings by running the benchmark, and the minimum FPS it reports is 33. Hooray! That must mean I can play the game at those settings and never drop below 30 FPS, right? Nope. The thing is, some actual scenes in the game are more demanding than anything in the benchmark, so the benchmark's minimum FPS is not a reliable lower bound on the minimum FPS you'll see in the game. You might not mind if it's just one scene in the whole game, but if FPS drops occur regularly it gets annoying enough to send you back into the settings menu yet again.
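As an aside, the reason a single "minimum FPS" number from a benchmark can mislead is easy to see if you look at raw per-frame times instead. Here's a small sketch of how you might summarize a frametime log yourself; the function name and the log format (one frame duration in milliseconds per entry, the kind of data capture tools can export) are my own assumptions, not anything specific to Dxtory or Arkham City:

```python
def fps_stats(frametimes_ms):
    """Summarize a list of per-frame durations (in milliseconds).

    Returns average FPS, the single worst frame as FPS, and the
    "1% low" (average FPS over the slowest 1% of frames), which is
    a steadier indicator of stutter than the raw minimum.
    """
    # Convert each frame duration to an instantaneous FPS value.
    fps = [1000.0 / ft for ft in frametimes_ms]
    fps_sorted = sorted(fps)
    n = len(fps_sorted)
    # Slowest 1% of frames (at least one frame).
    worst = fps_sorted[: max(1, n // 100)]
    return {
        "avg_fps": sum(fps) / n,
        "min_fps": fps_sorted[0],
        "one_pct_low_fps": sum(worst) / len(worst),
    }


# Example: 99 smooth frames at 25 ms (40 FPS) plus one 50 ms hitch.
stats = fps_stats([25.0] * 99 + [50.0])
print(stats)
```

The point of the 1% low is exactly the problem above: one brief hitch barely moves the average, but it defines the minimum, and neither number tells you whether some unvisited scene will be worse.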
The bottom line: I wish every game developer would include not just a benchmark, but a benchmark scene that is as demanding as the most demanding scene in the game. That way you'd get a reliable estimate of the lowest FPS you'll see at specific video settings. You could configure everything before starting a new game and never need to go back to the settings menu.
What about you? How much do you fiddle with video settings and how do you decide at what settings you'll play a game?