OK, quick question for everyone inclined to respond.
A couple of years back I got this rig that used to be a beast: quad-core Q6600, GTX 275, 4GB of RAM, and so on. At the time, no game ran above 1680x1050 (at least officially), and I could play any RPG on the market maxed out with fantastic framerates.
Fast forward to 2010, and I find the Mafia 2 and Kane & Lynch 2 demos that I installed out of curiosity horribly choppy and quite unplayable compared to what I was used to. Even StarCraft 2 gives me terrible FPS on Ultra settings...
So what I wanted to ask is this: can't I just keep using my rig to play everything maxed out at 1680x1050? Do I need to constantly upgrade to maintain a decent FPS in future games? Fallout: New Vegas will probably run perfectly, since it uses the same engine as FO3, but then I think about DA2 and The Witcher 2 and I get upgrade anxiety.
It doesn't even make sense though, does it? They're the same settings... so what changed?
The games are what's changed. Most developers try to pack more into each new title, whether that's more detailed imagery, more imagery overall, or both. Because newer titles are doing more work per frame, your hardware has to work harder to keep up. Eventually a PC either can't keep up at all, or it can't maintain playable framerates without reducing the load by lowering the detail level.
A PC is nothing like a console, because PC games aren't developed to run on the exact same hardware year after year. Instead, they're often built to push even the newest hardware, which makes older hardware struggle and sometimes falter outright.