Can Your Old Athlon 64 Still Game?

Game Benchmarks: Crysis

Also released in November 2007, Crysis is one of the most hardware-demanding games currently available. Even very high-end GPUs can be brought to a crawl when playing Crysis in DX10 at high resolutions. We take a look at both the CPU and GPU benchmarks included with the game, which readers can try on their own systems. While performance numbers from these demos will not match actual gameplay, they place demands on both the GPU and CPU and give an idea of how well these processors keep up with the 8800 GS; keep in mind that performance while gaming may be lower still. We test at both all-medium and all-high settings with 16xAF.
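
For readers who want to script these runs at home, below is a minimal sketch of one way to launch the bundled benchmark and pull average-FPS figures from its output. It assumes the game's Bin32 folder contains the Benchmark_CPU.bat/Benchmark_GPU.bat launchers and that the run writes "FPS:" lines to a plain-text log; the install path, log name, and log format shown here are placeholders to adjust for your own system.

    import re
    import subprocess
    from pathlib import Path

    # Assumed install location and script names -- adjust to match your system.
    CRYSIS_BIN = Path(r"C:\Program Files\Electronic Arts\Crytek\Crysis\Bin32")
    BENCHMARK = CRYSIS_BIN / "Benchmark_GPU.bat"   # or Benchmark_CPU.bat
    LOG_FILE = CRYSIS_BIN / "benchmark_log.txt"    # hypothetical log name/format

    def run_benchmark() -> None:
        """Launch the bundled benchmark batch file and wait for it to finish."""
        subprocess.run([str(BENCHMARK)], cwd=CRYSIS_BIN, check=True)

    def average_fps(log_path: Path) -> float:
        """Average every 'FPS: <number>' entry found in the (assumed) log."""
        text = log_path.read_text(errors="ignore")
        samples = [float(m) for m in re.findall(r"FPS:\s*([\d.]+)", text)]
        if not samples:
            raise ValueError(f"no FPS entries found in {log_path}")
        return sum(samples) / len(samples)

    if __name__ == "__main__":
        run_benchmark()
        print(f"Average FPS: {average_fps(LOG_FILE):.1f}")

Averaging a whole run understates how rough the worst moments feel, which is why the minimum FPS figures in the charts deserve as much attention as the averages.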

As in CoD4, the single-core A64 4000+ struggles with this new title. It was unable to average 30 FPS at any resolution, and its averages were nearly as low as the minimum FPS posted by the dual-core CPUs. Clearly medium settings, and especially the physics effects, are too much for the single-core CPU.

The GPU benchmark is a flyby. Without the demands of a destructible environment and physics effects, the single-core CPU stays fairly close to the X2 4200+. Performance barely drops as the resolution rises until we reach 1600x1200, which means the CPU, not the graphics card, is still limiting performance at the lower resolutions in the GPU benchmark. The X2 5600+ puts up much better numbers than the other two CPUs.
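
That flat-until-1600x1200 pattern is the usual tell for a CPU limit: when average FPS barely moves as the resolution climbs, the graphics card still has headroom and the processor is pacing the run. Here is a small sketch of that check, using made-up illustrative numbers rather than our measured results:

    def likely_cpu_limited(fps_by_resolution: dict[str, float],
                           tolerance: float = 0.10) -> bool:
        """Heuristic: if average FPS varies by less than `tolerance` (10% here)
        across resolutions, the GPU has headroom and the CPU sets the pace."""
        values = list(fps_by_resolution.values())
        spread = (max(values) - min(values)) / max(values)
        return spread < tolerance

    # Illustrative numbers only -- not the measured results from this article.
    low_res_only = {"1024x768": 41.0, "1280x1024": 39.5}   # nearly flat
    with_1600 = {**low_res_only, "1600x1200": 31.0}        # big drop at 1600x1200
    print(likely_cpu_limited(low_res_only))  # True: CPU-bound at low resolutions
    print(likely_cpu_limited(with_1600))     # False: the GPU takes over at 1600x1200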

Enabling high details, the X2 4200+ now falls behind the X2 5600+, which benefits from its clock speed and L2 cache advantage. By the time we hit 1600x1200, the 8800 GS can no longer keep up, and both dual-core CPUs deliver roughly equal performance. Not surprisingly, the added demands of high details are too much for the single-core CPU.

At high details in the GPU test, the X2 5600+ again provides the best low-resolution performance, while the X2 4200+ barely averages 30 FPS at 1024x768. The 8800 GS struggles as we raise the resolution, and at stock speeds performance falls well under 30 FPS at 1280x1024. It takes a hefty GPU to handle 1600x1200 at high details, and our 8800 GS is far from capable at these settings. Enabling FSAA with this card would be pointless, and is better left to a card like the GTX 280, HD 4870 X2, or an SLI/CrossFire solution.

In Crysis, gamers on a single-core processor may find they need to reduce physics effects all the way to low, which greatly diminishes the fun and wow factor of the game. While we could have used a more powerful GPU to handle the higher resolutions and high details, it's still easy to see that Crysis demands a hefty CPU for the best gameplay. Single-core CPU owners are out of luck, and owners of lower-clocked dual-core CPUs may want to look into overclocking for this game. Performance and limiting factors will vary by level, so don't be surprised if settings need to be turned down further as the game progresses.

140 comments
  • Schip
    FIRST POST!!! Nice Article though. I knew my brother would soon be doomed with his P4 2.8c ;)
    3
  • Anonymous
    "AMD Athlon 64 X2 4200 + dual-core, which has a 2.2 GHz Manchester architecture with 512 MB L2 cache per core."
    Wow! That's a lot of cache :D
    5
  • neiroatopelcc
    I haven't read the actual article yet, but I bet the simple answer is no!
    I've got a backup gaming rig at home that barely cuts it. An x2 1.9ghz (oc'ed to 2.4) with an 8800gtx and 3gb memory. That rig struggles at 1280x1024 in some situations, and it can only be attributed to the cpu really.
    -14
  • bf2gameplaya
    2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away.

    Who would have thought DDR would have such durability? There's something to be said for CAS2!
    5
  • cangelini
    Surprisingly, you can actually do fairly well. Of course, it depends on the app...
    0
  • neiroatopelcc
    But your opteron cpu still limits the modern graphics cards.
    Two years back I bought my 8800gtx, and realized it wouldn't reach its full potential with my Opteron 170 (@ 2.7). A friend with another GTX paired with an E6400 chip (@ 3GHz) scored a full 30% higher in 3DMark than I did, and it showed in games. Even in WoW, where you'd expect a Casio calculator to deliver enough graphics power.

    In short - yes, DDR still works if you've got enthusiast parts, but that can't negate the effect a faster CPU would give. At least at decent resolutions (22" wide).
    3
  • dirtmountain
    This is a great article! It will give me something to show when I'm talking to people about a new system or just a GPU/PSU upgrade. Great job by Henningsen.
    0
  • NoIncentive
    I'm still using a P4 3.0 @ 3.4 with 1 GB DDR 400 and an nVidia 6800GT...

    I'm building a new computer next week.
    1
  • randomizer
    I can echo the findings in Crysis. It didn't matter what settings I ran with a 3700 Sandy and an X1950 pro, the framerate was almost the same (albeit low 20s because the card is slower). Added an E6600 to the mix and my framerate tripled at lower settings.

    It would have been interesting to see how a 3000+ Clawhammer (C0 stepping) would do in Crysis. Single-channel memory, poor overclocking capabilities... FAIL!
    4
  • ravenware
    bf2gameplaya: "2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away. Who would have thought DDR would have such durability? There's something to be said for CAS2!"

    This is true about the DDR. I recall an article on Tom's right after the release of the AM2 socket which tested identical dual-core processors against their 939 counterparts; the tests showed little to no performance gains.

    Great article, there has been some discussion about this in the forums as well.

    I currently own a 939 4200+ X2 that's paired with a 7800GT, and this article shows what I thought to be accurate about the AMD64 chips. They're not as fast as some of the C2Ds, but they still kick ass.

    Good job pointing out the single core factor in newer games too. As soon as the crysis demo was released I upgraded my San Diego core to a dual core and noticed the difference in crysis immediately.

    This article gives me further confidence in my decision to hold on upgrading my system. I want to hold out for Windows7 D3D11 and more money to build an ape sh** machine :D

    Nice article!!
    0
  • giovanni86
    Good article, enjoyed it very much considering I run an AMD Athlon 3500+ Venice core and have an XFX 9800GTX. It runs great, but big battles are very choppy, and in any highly demanding game like COD4 and Crysis I have to suffer by not being able to max out settings. I almost blamed the GPU, but I knew sooner or later I had to upgrade to a newer system rather than just opting for a newer GPU. I had a 6600GT which did great for the time being, but it showed its age this past year. Great article!
    0
  • groo
    I figured my 3600+ Brisbane was really holding me back, even when OC'd, when I got my 4850. Looks like, for the most part, it isn't, because I like lots of eye candy.

    I still plan on upgrading soon :)
    0
  • neiroatopelcc
    Get 4 gigs of memory instead of 3. If you're lucky, quad-channel memory will make its way into a future AMD CPU, and 3GB is pointless with the current pricing on DDR2 anyway. Get 2x2GB and upgrade with 2x2GB more if you want. In 2009 you may want to go get a 64-bit Windows 7 to match your hardware anyway :)
    -7
  • mohdwahidi
    Thanks for the great article. This sure saves me a lot of money. I have an AMD Athlon 64 X2 3800+ with an HD 4850, and I can play Racedriver: GRID at Ultra settings. Reading your articles makes me think of upgrading to a 6000+.
    0
  • da bahstid
    I'm impressed to see that the single-core Athlon didn't completely crash and burn anywhere. Just for kicks, it would have been funny to see it try to drive a 4870X2. I can definitely tell it weighs down my 6400+ X2 even at 3.45GHz. Entertainment value aside, this is one of the more objectively conducted articles I've seen here recently, and I actually like seeing some of these oddball scenarios getting played out. I can second the finding that an Athlon X2 at 2.4GHz is a ballpark minimum for driving an 8800/9800 or 4800 series card. Fortunately most X2s can overclock to around 3+GHz... other hardware allowing.
    0
  • chill_king
    Interesting article. I find it amazing that it's almost 2009 and I can still play the likes of STALKER: Clear Sky fairly well on my 939 Asus A8N-SLI motherboard, released in early 2005, with an aging 4200+ X2.

    I've always avoided upgrades just for the sake of so-called "future proofing," and so I've been trying to get the most out of my 4200+ X2, which I've overclocked from 2.2 to 2.8GHz (rock solid stable).

    Currently getting 10500 in 3DMark06 and can handle most games to date at 1280x1024, bar Crysis/Warhead. I have two 8800GTs in SLI, so no GPU limitation there (I know it's overkill, but I got them to use with my next rig).

    However, I got my copy of Far Cry 2 today, which I expect to be the nail in the coffin for my 939. With the likes of Dead Space and Fallout 3 still to play, I'm looking at an E8400 to give my 8800GTs something to chew on.
    0
  • neiroatopelcc
    Come to think of it, my previous statement is not necessarily right. I mean the first one. While an AMD X2 does poorly in real life compared to synthetic benches, my dad plays Supreme Commander on his old P4 2.4 (Northwood) with 2GB PC3200 and a 7600GS card. Surely not at 1680x1050, but still quite well. So I suppose it depends on the games played. But for any game I'd be playing, my secondary PC really just isn't fast enough. Even the 3 Raptors in stripe mode feel.... slow

    PS: can someone explain why half the time I can't post comments? I'm posting from an XP in VMware atm, as the host OS can't anymore. For now anyway.
    -5
  • feraltoad
    What a great article! I can see this as really being useful for those still hanging on and trying to decide what to do about upgrades.
    0
  • Chuck Norris
    Haha, I could have used this article about a week ago! But, better late than never, as they say.

    I have a nearly 4 year old Dimension 8400 from Dell with a Pentium 4 @ 3.2GHz HT. I heard plenty of people say that getting a "good" video card for such a system would be a waste as the single core processor would bottleneck, but I opened up my 9600GT today (birthday gift!) and it's amazing (compared to my 7900GS I purchased 2 years ago). I have no doubt that upgrading to a nice C2Q would improve my performance even more but I am very happy so far.

    If you have an aging computer, don't hesitate to upgrade a component or two to keep you going until you can buy a new system, but do shoot for the best price for performance. (A pre-overclocked 9600GT for $80 from Newegg after a $20 rebate with free shipping: superb!)
    0