Thinking about how to predict performance at different resolutions, it hit me that essentially all a graphics card does is draw dots on the screen. So if we just figure out how many dots it has to draw, we can predict performance. For example, 1920x1080 is about 2.07 million pixels (megapixels), and 1600x900 is 1.44 MP. So we might expect roughly 2.07/1.44 ≈ 1.44x the frames per second dropping to 1600x900, assuming all settings are the same. Likewise, 1366x768 is about 1.05 MP, so 1920x1080 should have almost exactly half the framerate of 1366x768.
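The scaling idea above can be sketched in a few lines, assuming FPS is inversely proportional to total pixel count (the resolutions and the 60 FPS baseline are just example numbers, not from any benchmark):

```python
def pixels(width, height):
    return width * height

def predicted_fps(base_fps, base_res, target_res):
    # If FPS is inversely proportional to pixel count:
    # fps_target = fps_base * (pixels_base / pixels_target)
    return base_fps * pixels(*base_res) / pixels(*target_res)

# Example: 60 FPS at 1920x1080 predicts about 86 FPS at 1600x900...
print(round(predicted_fps(60, (1920, 1080), (1600, 900))))  # 86
# ...and about 119 FPS at 1366x768, i.e. almost double.
print(round(predicted_fps(60, (1920, 1080), (1366, 768))))  # 119
```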
Benchmarks seem to agree with this. Is it really this simple?