OK, quick question for everyone inclined to respond.
A couple of years back I got this rig, and it used to be a beast: quad-core Q6600, GTX 275, 4GB of RAM, and so on. At the time no game ran above 1680x1050 (at least officially), and I could play any RPG on the market maxed out with fantastic framerates.
Fast forward to 2010, and I find the Mafia 2 and Kane & Lynch 2 demos, which I installed out of curiosity, horribly choppy and pretty much unplayable compared to what I was used to. Even StarCraft 2 gives me terrible FPS on Ultra settings...
So what I wanted to ask is this: can't I just keep using my rig to play everything maxed out at 1680x1050? Do I need to constantly upgrade to maintain decent FPS in future games? Fallout: New Vegas will probably run perfectly since it uses the same engine as FO3, but then I think about DA2 and The Witcher 2 and I get upgrade anxiety.
It doesn't even make sense though, does it? It's the same settings...what changes?