Can Your Old Athlon 64 Still Game?

It is always fun to see how the latest and greatest hardware performs, but not all gamers have cutting-edge machines or the means to upgrade every time a faster graphics card hits the streets. Many gamers want to know how to best squeeze more life out of their rigs—or whether it’s even possible.

Very affordable PCI Express gaming cards are all over the place now, but are any worth putting into an aging machine? Will that upgrade alone allow you to play the latest popular games at high details? How much CPU do you need? Does a single-core CPU still have what it takes? Will any dual-core chip suffice, regardless of its clock speeds?

We try to answer every single one of these questions as we take an aging—yet hopefully capable—gaming box, put in a couple of affordable graphics cards, and test its performance in some modern games with an older single-core CPU and with two dual-core processors.

Our main focus today is on AMD Athlon 64 systems built on a Socket 754, Socket 939, or Socket AM2 motherboard, but anyone with a system that is now a couple of years old and starting to show its age stands to benefit from our little blast from the past. Let's see if it's time to retire that once-mighty gaming system altogether or if a graphics upgrade can extend its useful life a while longer.

  • Schip
    FIRST POST!!! Nice article, though. I knew my brother would soon be doomed with his P4 2.8c ;)
  • (quoting the article) "AMD Athlon 64 X2 4200+ dual-core, which has a 2.2 GHz Manchester architecture with 512 MB L2 cache per core."
    wow! that's a lot of cache :D
  • neiroatopelcc
    I haven't read the actual article yet, but I bet the simple answer is no!
    I've got a backup gaming rig at home that barely cuts it. An x2 1.9ghz (oc'ed to 2.4) with an 8800gtx and 3gb memory. That rig struggles at 1280x1024 in some situations, and it can only be attributed to the cpu really.
  • bf2gameplaya
    2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away.

    Who would have thought DDR would have such durability? There's something to be said for CAS2!
  • cangelini
    Surprisingly, you can actually do fairly well. Of course, it depends on the app...
  • neiroatopelcc
    But your opteron cpu still limits modern graphics cards.
    Two years back I bought my 8800gtx and realized it wouldn't reach its full potential with my opteron 170 (@ 2.7). A friend with another gtx paired with an e6400 chip (@ 3ghz) scored a full 30% higher in 3dmark than I did, and it showed in games. Even in wow, where you'd expect a casio calculator would deliver enough graphics power.

    In short - yes, DDR still works if you've got enthusiast parts, but that can't negate the effect a faster cpu would give. At least at decent resolutions (22" wide).
  • dirtmountain
    This is a great article! It will give me something to show when i'm talking to people about a new system or just a GPU/PSU upgrade. Great job by Henningsen.
  • NoIncentive
    I'm still using a P4 3.0 @ 3.4 with 1 GB DDR 400 and an nVidia 6800GT...

    I'm building a new computer next week.
  • randomizer
    I can echo the findings in Crysis. It didn't matter what settings I ran with a 3700 Sandy and an X1950 pro, the framerate was almost the same (albeit low 20s because the card is slower). Added an E6600 to the mix and my framerate tripled at lower settings.

    It would have been interesting to see how a 3000+ Clawhammer (C0 stepping) would do in Crysis. Single-channel memory, poor overclocking capabilities... FAIL!
  • ravenware
    bf2gameplaya: "2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away. Who would have thought DDR would have such durability? There's something to be said for CAS2!"

    This is true about the DDR. I recall an article on Tom's right after the release of the AM2 socket which tested identical dual-core processors against their 939 counterparts; the tests showed little to no performance gains.

    Great article; there has been some discussion about this in the forums as well.

    I currently own a 939 4200+ X2 that's paired with a 7800GT, and this article shows what I thought to be accurate about the AMD64 chips. They're not as fast as some of the C2Ds, but they still kick ass.

    Good job pointing out the single core factor in newer games too. As soon as the crysis demo was released I upgraded my San Diego core to a dual core and noticed the difference in crysis immediately.

    This article gives me further confidence in my decision to hold off on upgrading my system. I want to hold out for Windows 7, D3D11, and more money to build an ape sh** machine :D

    Nice article!!
  • giovanni86
    Good article, enjoyed it very much considering I run an AMD Athlon 3500+ Venice core and have an XFX 9800GTX. It runs great, but big battles are very choppy, and in any highly demanding game like COD4 and Crysis I have to suffer by not being able to max out settings. I almost blamed the GPU, but I knew sooner or later I had to upgrade to a newer system rather than just opting for a newer GPU. I had a 6600GT which did great for the time being, but it showed its age this past year. Great article!
  • groo
    I figured my 3600+ Brisbane was really holding me back, even when OC'd, when I got my 4850. Looks like, for the most part, it isn't, because I like lots of eye candy.

    I still plan on upgrading soon :)
  • neiroatopelcc
    Get 4 gigs of memory instead of 3. If you're lucky, quad-channel memory will make its way into a future AMD cpu, and 3gb is pointless with the current pricing on DDR2 anyway. Get 2x2gb and upgrade with 2x2gb more if you want. In 2009 you may want to get a 64-bit Windows 7 to match your hardware anyway :)
  • mohdwahidi
    Thanks for the great article; this sure saves me a lot of money. I have an AMD Athlon 64 X2 3800+ with an HD4850, and I can play Race Driver: GRID at the Ultra setting. Reading your article makes me think about upgrading to a 6000+.
  • da bahstid
    I'm impressed to see that the single-core Athlon didn't completely crash and burn anywhere. Just for kicks it would have been funny to see it try to drive a 4870X2. I can definitely tell it weighs down my 6400+ X2 even at 3.45GHz. Entertainment value aside, this is one of the more objectively conducted articles I've seen here recently, and I actually like seeing some of these oddball scenarios get played out. I can second the finding that an Athlon X2 at 2.4GHz is a ballpark minimum for driving an 8800/9800 or 4800 series card. Fortunately most X2s can overclock to around 3+GHz...other hardware allowing.
  • chill_king
    Interesting article. I find it amazing that it's almost 2009 and I can still play the likes of STALKER: Clear Sky fairly well with an aging 4200+ X2 on my 939 Asus A8N-SLI motherboard, which was released in early 2005.

    I've always avoided upgrades just for the sake of so-called "future proofing" and so have been trying to get the most out of my 4200+ X2, which I've overclocked from 2.2 to 2.8GHz (rock solid stable).

    Currently getting 10500 in 3DMark06 and can handle most games to date at 1280x1024, bar Crysis/Warhead. I have two 8800GTs in SLI, so no GPU limitation there (I know it's overkill, but I got them to use with my next rig).

    However, I got my copy of Far Cry 2 today, which I expect to be the nail in the coffin for my 939. With the likes of Dead Space and Fallout 3 still to play, I'm looking at an E8400 to give my 8800GTs something to chew on.
  • neiroatopelcc
    Come to think of it, my previous statement is not necessarily right. I mean the first one. While an AMD X2 does poorly in real life compared to synthetic benches, my dad plays Supreme Commander on his old P4 2.4 (Northwood) with 2gb PC3200 and a 7600GS card. Surely not at 1680x1050, but still quite well. So I suppose it depends on the games played. But for any game I'd be playing, my secondary pc really just isn't fast enough. Even the 3 Raptors in stripe mode feel .... slow

    ps. can someone explain why half the time I can't post comments? I'm posting from XP in VMware atm, as the host OS can't anymore. For now anyway.
  • feraltoad
    What a great article! I can see this as really being useful for those still hanging on and trying to decide what to do about upgrades.
  • Chuck Norris
    Haha, I could have used this article about a week ago! But, better late than never, as they say.

    I have a nearly 4 year old Dimension 8400 from Dell with a Pentium 4 @ 3.2GHz HT. I heard plenty of people say that getting a "good" video card for such a system would be a waste as the single core processor would bottleneck, but I opened up my 9600GT today (birthday gift!) and it's amazing (compared to my 7900GS I purchased 2 years ago). I have no doubt that upgrading to a nice C2Q would improve my performance even more but I am very happy so far.

    If you have an aging computer, don't hesitate to upgrade a component or two to keep you going until you can buy a new system, but do shoot for the best price for performance. (Pre-overclocked 9600GT for $80 from newegg after ($20) rebate and free shipping--superb!)
  • KRayner
    This article mirrored my experience. I recently upgraded from an A64 3200+ (single core, 2GHz), 2GB DDR400 + 8800GTS 320MB (XFX XXX edition) to an E8400 (dual core, 3GHz), 4GB DDR2-800 + the same GPU, and I went from not being able to play a lot of newer games properly to playing almost all of them at my LCD's native res (1680x1050) with the highest detail settings. I knew the CPU was the limiting factor after playing Race Driver: GRID (awesome game btw); it didn't seem to matter which res I tried playing the game at, it still ran badly. After the upgrade I now run the game @ 1680x1050 with all the settings at their highest and it's very playable; in most cases the game runs around 50-60FPS, with some slowdown if there's a massive collision, though I reckon that's down to the GPU and its limited 320MB of memory. Another example was Mass Effect: after trying the game out on my old rig I was left very disappointed, as it ran quite badly. In particular the Citadel level was running around 15FPS no matter which res I changed to (well, slight increases, but nothing that made the game playable), but after the upgrade I finished the game (x2) on highest settings @ 1680x1050 and it ran around 60FPS for the most part.

    I'm very happy with my setup atm, but I'm looking into upgrading the GPU sometime within the next few months, as I would love to play Crysis + Warhead @ 1680x1050 in DX10 (very high settings); I had to play in DX9, 1280x800 with all settings on high. It still looked better than any other game out (even though the game is a year old), but I've got that 'it can look better' itch which I simply have to scratch :-p

    Am currently undecided on whether to get the 260+ or the 4870 1GB (with a custom cooler, of course); think I'll wait till after the holiday season. Sucks to buy anything atm anyway, as the Rand/Dollar rate is so bad (I live in South Africa, Cape Town) due to the recent financial crisis. 2-3 weeks ago it was 7.80 to 1 (R vs $) and now it's sitting at an insane 11.40 to 1 :-(
  • groo
    Don't worry, I have a feeling the US dollar will be crashing even further in value after the election. Unfortunately, it won't help you as much with everything being made in Asia.
  • KRayner
    Wish it were that simple, but the R/$ affects a lot of things in this country. Cars + tech come to mind. Put it this way: the Leadtek 260 was going for R2499 ex. VAT (tax here in SA) and now it's going for R2999 ex. VAT. That's an increase of 20%, and bear in mind this doesn't even include the worst of the weaker Rand, as the price list will be updated next week. Not sure how much more expensive it's going to get, but not looking forward to it regardless...
  • Mottamort
    @ KRayner

    Don't you just hate the Rand/Dollar exchange rate? As a student I'm having to cope with an Athlon 3500+ and an 8600GT with 1 gig of RAM. Upgrading simply isn't an option, as the difference between entry-level, mid-range, and high-end motherboards/CPUs/GPUs jumps tremendously with each step up. Really sucks too, because I'm admittedly an ATI fanboy and would have loved to see ANY 4000-series ATI. Too bad nvidia basically owns this country and ATIs are incredibly expensive (about $50-$100 more than in the USA), meaning their "great value" factor is reduced to nothing...bleh