Graphic Card Timeline? Need decent card for 5 year old games

PTNLemay

Distinguished
Sep 24, 2011
I'm looking to get into some old classics that I missed out on when they first came out. Titles include Half-Life, BioShock, Dead Space, Mass Effect, and eventually (if my graphics card can support it) their accompanying sequels. Now... I checked the system requirements for some of these, but it seems like in only a few years all of the terms have completely changed. What I would simply like is some kind of graphical chart or timeline that shows the evolution of graphics cards from the two big-name companies, and the associated terms. Ideally with a big red line to show when they reset the numbers and changed everything. I personally find it ridiculous and more than a little confusing that my 9000 is woefully outdated when compared to the 6000s.

Because, yeah, I have an HTPC with a small GPU of some sort integrated onto the motherboard; my diagnostics tell me it's a GeForce 9200. I also bought (on the advice of a friend) a proper discrete graphics card that should work with my HTPC: a Radeon HD 5450. Just to give you an example, the Dead Space system requirements are a GeForce 6800 / ATI Radeon X1600 Pro (minimum), or a GeForce 8600 / ATI Radeon HD 2600 (recommended). I usually check the Falcon Guide for these matters, but I'm not seeing any of the mentioned cards, even in the destitute section. I'm not sure how my graphics hardware stacks up to these requirements...

Oh, another thing, in case this helps: I'm on Windows 7 64-bit with a fairly standard dual-core AMD Athlon II X2 220. I'm not sure, but I assume this limits the kind of GPU I can plug into my mobo? I used to also be limited by power requirements (which is why I went with the HD 5450), but I have since upgraded to a nice Gold-rated 500W PSU.
 

PTNLemay

Distinguished
Sep 24, 2011
Here's another example: I looked up the requirements for Mass Effect 2 (probably the latest and most graphics-intensive game I'm considering), and the recommended specs mention a Radeon HD 2900. I have a Radeon HD 5450, which seems to be several generations ahead of it. But I looked closer, and the HD 2900 has 700 million transistors, while my HD 5450 has just under 300 million. The various generations overlap, in that the high-end stuff from a few years ago seems more powerful than the low-end stuff of today.

So when the recommendations say "For a smooth gaming experience we would recommend the 2900", how do I translate that into modern-day cards? Should I just go by transistor count, or is there more to it than that?

EDIT:
Oh... yeah. Actually that looks to be pretty close to what I was looking for, lol. Thanks.
 

PTNLemay

Distinguished
Sep 24, 2011
And is there a way to know approximately where a given CPU starts to bottleneck these cards? Because I know that if you put a recent card in a build that's using an old CPU, it can cause stuttering or something.

I realize I should probably just replace the CPU, but at that point I'd probably have to replace the whole innards, and I'd rather not burn through my savings too quickly. I have around $800 saved up, so I could replace the whole mobo/CPU/RAM/GPU/OS if I really had to, but... if I can recycle more out of my current pre-built, I will.
 

alrobichaud

Distinguished
Nov 9, 2011
[quotemsg]EDIT:
Oh... yeah. Actually that looks to be pretty close to what I was looking for, lol. Thanks.[/quotemsg]


I am assuming that is because you noticed that the chart I linked lists both AMD and Nvidia cards from slowest to fastest. The HD 5450 is much lower on the list than the HD 2900, even though it is a newer card, because it is much slower.
 

PTNLemay

Distinguished
Sep 24, 2011

I have a 1080p 60 Hz monitor, but I'm very willing to drop the resolution down to be able to game. That's what I did for Portal 2 and StarCraft II. At full 1080p the computer was evidently struggling, so I dropped it to 1600 x 900, and I can run both without much trouble now. StarCraft II runs at 30 frames per second when things stand still and drops down to 20 FPS when the screen is full of action, but yeah, so far no crashes.

And this is all before plugging in the HD 5450; I'm actually running StarCraft II off of the GeForce 9200. Which shouldn't be possible, lol. I've been nervous about plugging in the HD 5450, though, because I've never done that before. I'll try it this weekend when I get some rest and gather my courage.

Yeah, probably, but I'll be patient and save my money. When I decide to upgrade to "real" gaming, I figure I'll rebuild most of my computer. For now I'll stretch this HTPC out for as long as I can. If it's not broken, don't fix it; and if it still runs, keep it.