Can Your Old Athlon 64 Still Game?

Radeon HD 4850: Adding More GPU Power

When testing began for this article, the 8800 GS was such a bargain that it made a great option for an aging system that didn't warrant a very expensive GPU. Its biggest weakness showed in our 1600x1200 4x FSAA testing.

In the meantime, AMD released the HD 4850 and brought a new level of performance to sub-$200 cards. This card is far more capable of handling high resolutions and high levels of FSAA. Given the attractive pricing, and the fact that our 8800 GS was often unable to handle our most demanding test settings, we decided to drop in an HD 4850 and take another look at how well our three CPUs would keep up with it. At low resolutions, we would expect our slower CPUs to hold this card back even more than they did the 8800 GS, so we concentrated on just the highest resolution previously tested. We include the 8800 GS results for comparison, but keep in mind that its performance had dropped off significantly in some of these tests.

Unlike the 8800 GS, the HD 4850 easily handled 1600x1200 with 4x FSAA. We see slight CPU scaling by clock speed.

In Far Cry, both video cards are held back by our two slower CPUs and deliver about the same performance. Notice, however, that when paired with the X2 5600+, the HD 4850 has room to breathe and distances itself from the 8800 GS.

These settings were unplayable with the 8800 GS, but posed no problem for the HD 4850. We see CPU scaling similar to what we saw at low resolution with the 8800 GS.

We were almost entirely GPU-limited with the 8800 GS once we reached these settings, but the HD 4850 has enough power to show once again just how important the CPU is in both of these Oblivion test areas. The X2 5600+ is the clear winner here, while the single-core CPU cannot average 30 FPS in either test.
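The CPU-limited vs. GPU-limited behavior described above can be sketched with a simple bottleneck model: each component imposes a framerate ceiling, and the delivered framerate is capped by whichever ceiling is lower. The numbers below are hypothetical illustrations, not measured results from this article.

```python
# Minimal bottleneck model of CPU/GPU framerate limits.
# All FPS caps below are hypothetical, chosen only to illustrate the idea.

def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Delivered framerate is limited by the slower component."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps: a slow single-core CPU vs. a fast dual-core,
# and a weaker vs. a stronger GPU.
cpu_caps = {"single-core": 28.0, "X2 5600+": 55.0}
gpu_caps = {"8800 GS": 35.0, "HD 4850": 60.0}

for cpu_name, cpu_cap in cpu_caps.items():
    for gpu_name, gpu_cap in gpu_caps.items():
        fps = effective_fps(cpu_cap, gpu_cap)
        print(f"{cpu_name} + {gpu_name}: {fps:.0f} FPS")
```

With the weaker GPU, both CPUs end up pinned near the GPU's ceiling, so the CPU difference barely shows; with the stronger GPU, the ceiling rises and the gap between the CPUs becomes visible, which matches the pattern seen in these benchmarks.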

  • Schip
    FIRST POST!!! Nice Article though. I knew my brother would soon be doomed with his P4 2.8c ;)
    Reply
  • "AMD Athlon 64 X2 4200 + dual-core, which has a 2.2 GHz Manchester architecture with 512 MB L2 cache per core."
    oau! that's a lot of cache :D
    Reply
  • neiroatopelcc
    I haven't read the actual article yet, but I bet the simple answer is no!
    I've got a backup gaming rig at home that barely cuts it. An x2 1.9ghz (oc'ed to 2.4) with an 8800gtx and 3gb memory. That rig struggles at 1280x1024 in some situations, and it can only be attributed to the cpu really.
    Reply
  • bf2gameplaya
    2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away.

    Who would have thought DDR would have such durability? There's something to be said for CAS2!
    Reply
  • cangelini
    Surprisingly, you can actually do fairly well. Of course, it depends on the app...
    Reply
  • neiroatopelcc
    But your opteron cpu still limits the modern graphics cards.
    Two years back I bought my 8800gtx, and realized it wouldn't come to its full potential in my opteron 170 (@ 2.7). A friend with another gtx paired with an e6400 chip (@ 3ghz) scored a full 30% higher in 3dmark than I, and it showed in games. Even in wow where you'd expect a casio calculator would deliver enough graphics power.

    In short - ye ddr still work if you've got enthusiast parts, but that can't negate the effect a faster cpu would give. At least at decent resolutions (22" wide)
    Reply
  • dirtmountain
    This is a great article! It will give me something to show when i'm talking to people about a new system or just a GPU/PSU upgrade. Great job by Henningsen.
    Reply
  • NoIncentive
    I'm still using a P4 3.0 @ 3.4 with 1 GB DDR 400 and an nVidia 6800GT...

    I'm building a new computer next week.
    Reply
  • randomizer
    I can echo the findings in Crysis. It didn't matter what settings I ran with a 3700 Sandy and an X1950 pro, the framerate was almost the same (albeit low 20s because the card is slower). Added an E6600 to the mix and my framerate tripled at lower settings.

    It would have been interesting to see how a 3000+ Clawhammer (C0 stepping) would do in Crysis. Single-channel memory, poor overclocking capabilities... FAIL!
    Reply
  • ravenware
Quoting bf2gameplaya: "2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU....blows these weak benchmarks away. Who would have thought DDR would have such durability? There's something to be said for CAS2!"
This is true about the DDR. I recall an article on toms right after the release of the AM2 socket that tested identical dual-core processors against their 939 counterparts; the tests showed little to no performance gain.

Great article, there has been some discussion about this in the forums as well.

I currently own a 939 4200+ X2 that's paired with a 7800GT, and this article confirms what I thought to be accurate about the AMD64 chips. They're not as fast as some of the C2D's but they still kick ass.

    Good job pointing out the single core factor in newer games too. As soon as the crysis demo was released I upgraded my San Diego core to a dual core and noticed the difference in crysis immediately.

This article gives me further confidence in my decision to hold off on upgrading my system. I want to hold out for Windows 7, D3D11, and more money to build an ape sh** machine :D

    Nice article!!
    Reply