Can Your Old Athlon 64 Still Game?
Tags:
- Hardware
- Games
Last response: in Reviews comments
Anonymous
October 24, 2008 5:50:03 AM
We'd all love to upgrade every time a new piece of gaming hardware drops, but that's an expensive proposition. You think your Athlon 64 system is fairly quick--any chance a simple graphics upgrade can bring it up to speed? We're aiming to find out.
Can Your Old Athlon 64 Still Game? : Read more
Schip
October 24, 2008 6:07:32 AM
Anonymous
October 24, 2008 6:33:19 AM
neiroatopelcc
October 24, 2008 6:39:56 AM
Score
-14
bf2gameplaya
October 24, 2008 6:52:20 AM
neiroatopelcc
October 24, 2008 6:57:26 AM
But your Opteron CPU still limits modern graphics cards.
Two years back I bought my 8800GTX and realized it wouldn't reach its full potential with my Opteron 170 (@ 2.7GHz). A friend with another GTX paired with an E6400 chip (@ 3GHz) scored a full 30% higher in 3DMark than I did, and it showed in games. Even in WoW, where you'd expect a Casio calculator would deliver enough graphics power.
In short, yes, DDR still works if you've got enthusiast parts, but that can't negate the effect a faster CPU would give. At least at decent resolutions (22" wide).
Score
3
dirtmountain
October 24, 2008 7:14:41 AM
NoIncentive
October 24, 2008 7:28:02 AM
I can echo the findings in Crysis. It didn't matter what settings I ran with a 3700+ San Diego and an X1950 Pro, the framerate was almost the same (albeit low 20s, because the card is slower). Added an E6600 to the mix and my framerate tripled at lower settings.
It would have been interesting to see how a 3000+ Clawhammer (C0 stepping) would do in Crysis. Single-channel memory, poor overclocking capabilities... FAIL!
Score
4
ravenware
October 24, 2008 7:44:22 AM
bf2gameplaya: 2.8GHz Opteron 185 (up from 2.6GHz) with 2x1MB L2 cache is the ultimate s939 CPU... blows these weak benchmarks away. Who would have thought DDR would have such durability? There's something to be said for CAS2!
This is true about the DDR. I recall an article on Tom's right after the release of the AM2 socket which tested identical dual-core processors against their 939 counterparts; the tests showed little to no performance gains.
Great article, there has been some discussion about this in the forums as well.
I currently own a 939 4200+ X2 that's paired with a 7800GT, and this article shows what I thought to be accurate about the AMD64 chips. They're not as fast as some of the C2Ds, but they still kick ass.
Good job pointing out the single-core factor in newer games too. As soon as the Crysis demo was released I upgraded my San Diego core to a dual core and noticed the difference in Crysis immediately.
This article gives me further confidence in my decision to hold off on upgrading my system. I want to hold out for Windows 7, D3D11, and more money to build an ape sh** machine.
Nice article!!
Score
0
giovanni86
October 24, 2008 8:32:13 AM
Good article, enjoyed it very much considering I run an AMD Athlon 64 3500+ Venice core and have an XFX 9800GTX. Runs great, but big battles are very choppy, and in any high-demanding game like COD4 and Crysis I have to suffer by not being able to max out settings. I almost blamed the GPU, but I knew sooner or later I had to upgrade to a newer system rather than just opting for a newer GPU. I had a 6600GT which did great for the time being, but it showed its age this past year. Great article!
Score
0
groo
October 24, 2008 8:41:00 AM
neiroatopelcc
October 24, 2008 8:59:48 AM
Score
-7
mohdwahidi
October 24, 2008 9:05:35 AM
da bahstid
October 24, 2008 9:18:51 AM
I'm impressed to see that the single-core Athlon didn't completely crash and burn anywhere. Just for kicks it would have been funny to see it try to drive a 4870X2. I can definitely tell it weighs down my 6400+ X2 even at 3.45GHz. Entertainment value aside, this is one of the more objectively conducted articles I've seen here recently, and I actually like seeing some of these oddball scenarios getting played out. I can second the finding that an Athlon X2 at 2.4GHz is a ballpark minimum for driving an 8800/9800 or 4800 series card. Fortunately most X2s can overclock to around 3+GHz... other hardware allowing.
Score
0
chill_king
October 24, 2008 9:27:25 AM
Interesting article. I find it amazing that it's almost 2009 and I can still play the likes of STALKER: Clear Sky fairly well on my 939 Asus A8N-SLI motherboard that was released early 2005, with an aging 4200+ X2.
I've always avoided upgrades just for the sake of so-called "future proofing" and so have been trying to get the most out of my 4200+ X2, which I've overclocked from 2.2 to 2.8GHz (rock solid stable).
Currently getting 10500 in 3DMark06 and can handle most games to date at 1280x1024, bar Crysis/Warhead. I have two 8800GTs in SLI so no GPU limitation there (I know it's overkill, but I got them to use with my next rig).
However, I got my copy of Far Cry 2 today, which I expect to be the nail in the coffin for my 939. With the likes of Dead Space and Fallout 3 still to play, I'm looking at an E8400 to give my 8800GTs something to chew on.
Score
0
neiroatopelcc
October 24, 2008 9:32:50 AM
Come to think of it, my previous statement is not necessarily right. I mean the first one. While an AMD X2 does poorly in real life compared to synthetic benches, my dad plays Supreme Commander on his old P4 2.4 (Northwood) with 2GB PC3200 and a 7600GS card. Surely not at 1680x1050, but still quite well. So I suppose it depends on the games played. But for any game I'd be playing, my secondary PC really just isn't fast enough. Even the 3 Raptors in stripe mode feel... slow.
ps. Can someone explain why half the time I can't post comments? I'm posting from an XP in VMware atm as the host OS can't anymore. For now anyway.
Score
-5
feraltoad
October 24, 2008 9:43:14 AM
Chuck Norris
October 24, 2008 10:19:52 AM
Haha, I could have used this article about a week ago! But, better late than never, as they say.
I have a nearly 4 year old Dimension 8400 from Dell with a Pentium 4 @ 3.2GHz HT. I heard plenty of people say that getting a "good" video card for such a system would be a waste as the single core processor would bottleneck, but I opened up my 9600GT today (birthday gift!) and it's amazing (compared to my 7900GS I purchased 2 years ago). I have no doubt that upgrading to a nice C2Q would improve my performance even more but I am very happy so far.
If you have an aging computer, don't hesitate to upgrade a component or two to keep you going until you can buy a new system, but do shoot for the best price for performance. (Pre-overclocked 9600GT for $80 from newegg after ($20) rebate and free shipping--superb!)
Score
0
KRayner
October 24, 2008 10:21:34 AM
This article mirrored my experience. I recently upgraded from an A64 3200+ (single core, 2GHz), 2GB DDR400 + 8800GTS 320MB (XFX XXX edition) to an E8400 (dual core, 3GHz), 4GB DDR2-800 + the same GPU, and I went from not being able to play a lot of newer games properly to playing almost all of them at my LCD's native res (1680x1050) with highest detail settings. I knew the CPU was the limiting factor after playing Race Driver GRID (awesome game btw): it didn't seem to matter which res I tried playing the game at, it still ran badly. After the upgrade I now run the game @ 1680x1050 with all the settings at their highest and it's very playable; in most cases the game runs around 50-60FPS with some slowdown if there's a massive collision, however that's the GPU and the limited 320MB memory I reckon. Another example was Mass Effect: after trying the game out on my old rig I was left very disappointed as it ran quite badly. In particular the Citadel level was running around 15FPS no matter which res I changed to (well, slight increases, but nothing that made the game playable), but after the upgrade I finished the game (x2) on highest settings @ 1680x1050 and it ran around 60FPS for the most part.
I'm very happy with my setup atm, however I am looking into upgrading the GPU sometime within the next few months as I would love to play Crysis + Warhead @ 1680x1050 in DX10 (very high settings); I had to play in DX9, 1280x800 with all settings on high. It still looked better than any other game out (even though the game is a year old), but I've got that 'it can look better' itch which I simply have to scratch :-p
Am currently undecided on whether to get the 260+ or the 4870 1GB (with custom cooler of course); think I'll wait till after the holiday season. Sucks to buy anything atm anyway as the Rand/Dollar rate is so bad (I live in South Africa, Cape Town) due to the recent financial crisis. 2-3 weeks ago it was 7.80 to 1 (R vs $) and now it's sitting at an insane 11.40 to 1 :-(
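For anyone wanting to run the same check: the quick test is exactly that resolution trick. If the framerate barely moves when you drop the resolution, the CPU is the limit; if it jumps, the GPU is. A rough sketch of that rule of thumb in Python (the numbers and the 10% tolerance are illustrative assumptions, not measurements from the article):

    # Rule-of-thumb check, assuming you log average FPS at two resolutions
    # with identical detail settings.
    def likely_bottleneck(fps_high_res, fps_low_res, tolerance=0.10):
        """If dropping the resolution barely raises FPS, the CPU is the limit."""
        gain = (fps_low_res - fps_high_res) / fps_high_res
        return "CPU-bound" if gain < tolerance else "GPU-bound"

    # Illustrative numbers only (not measured): GRID at 1680x1050 vs 1024x768.
    print(likely_bottleneck(fps_high_res=24, fps_low_res=26))  # -> CPU-bound
    print(likely_bottleneck(fps_high_res=24, fps_low_res=45))  # -> GPU-bound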
Score
0
groo
October 24, 2008 10:54:50 AM
KRayner
October 24, 2008 11:28:45 AM
Wish it were that simple, but the R/$ affects a lot of things in this country. Cars + tech come to mind. Put it this way: the Leadtek 260 was going for R2499 ex. VAT (tax here in SA) and now it's going for R2999 ex. VAT. That's an increase of 20%, and bear in mind this doesn't even include the worst of the weaker Rand, as the pricelist will be updated next week. Not sure how much more expensive it's going to get, but not looking forward to it regardless...
Score
-1
Mottamort
October 24, 2008 11:55:59 AM
@ KRayner
Don't you just hate the Rand/Dollar exchange. As a student I'm having to cope with an Athlon 3500+ and an 8600GT with 1 gig of RAM. Upgrading simply isn't an option, as the difference between entry-level, mid-range and high-end motherboards/CPUs/GPUs jumps tremendously with each step up. Really sucks too because I'm admittedly an ATI fanboy and would have loved to see ANY 4000-series ATI card. Too bad Nvidia basically owns this country and ATI cards are incredibly expensive (about $50-$100 more than in the USA), meaning their "great value" factor is reduced to nothing... bleh
Score
0
mitch074
October 24, 2008 11:59:53 AM
I bought an s939-based X2 3800+ in 2005, with 1 GB of RAM (in a single stick) and a GeForce 6600 (not GT). I upgraded the RAM to 2x1 GB right after that, and used to tweak latencies quite a lot (I now leave it at 3-3-3-5, 1T).
It served me rather well for quite some time (I'm not a gamer), but when I got a 1680x1050 screen recently, the Geforce cried 'ugh'.
And the Radeon HD4850 came out.
My CPU is 10% slower than the 4200 shown in benchmarks here; still, for those times I play, it now pulls its weight quite handily.
I must admit I removed most background tasks from the OS, probably freeing like 15% CPU time...
Score
0
Kohlhagen
October 24, 2008 12:18:14 PM
I still use the Asus A8R32-MVP Deluxe with an Opteron 165 (1.8) OCed to 2.7 and 2x1GB DDR400 CL2-2-3-2. I have an extra A8R32-MVP and would love to send it to you guys if you are going to have a part 2 for this.
I've also got a 3870, so I've always been curious whether CrossFiring it on my aging system would show any performance gain.
Score
0
KRayner
October 24, 2008 12:20:20 PM
@ mottamort
Yeah it does suck, although I'm not quite sure where you come up with the 'ATI cards are way more expensive than their Nvidia counterparts' theory. Looking at Frontosa's pricelist now, an Asus 4870 costs R600 less than an Asus 260. I've always supported the best bang-for-the-buck solution. I've had Intel, AMD and even Cyrix CPUs, as well as SiS, 3Dfx, Nvidia & AMD video cards. It's always about price vs performance for me. I do love the fact that ATI forced Nvidia to bring down their prices BIG time just to compete; folks like us really appreciate it ;-)
mottamort, where in SA do you live?
Score
0
KRayner
October 24, 2008 12:25:17 PM
@ Kohlhagen
Have a look at the VGA charts to make a decision; however, in my experience multi-card setups are always a bad idea. The biggest problem is game compatibility vs performance gained for the cost. I've always found it better to replace the aging card for a few bucks more and have a guaranteed performance increase, rather than a negligible increase in some titles. The multi-card idea is a good one but the reality is unfortunately not as good :-(
Score
0
KRayner
October 24, 2008 12:37:45 PM
@ Kohlhagen...again
Look here quick: http://www.tomshardware.com/charts/gaming-graphics-char...
It's a comparison of 3870s in CrossFire vs a single 4850. Looking on Newegg, the average difference between the two cards (one 3870 vs one 4850) seems to be around $40. Now the performance is quite similar, however bear in mind that in some titles the performance deficit for the CF setup is quite notable, highlighting my problems with this approach.
Just my 2c ;-)
Score
0
Mottamort
October 24, 2008 12:44:58 PM
@ KRayner. Stellenbosch, and I don't have the luxury of online stores, seeing as the banks are unwilling to allow me credit ^_^. So when I say the prices are different I'm referring to the actual crappy PC hardware SHOPS, which are a complete ripoff. But I'll have a look at Frontosa's pricelist. I've been looking at Rectron prices, although not recently.
Score
-1
Mottamort
October 24, 2008 12:47:45 PM
crowheart27us
October 24, 2008 12:58:41 PM
Well, I'm about to give this processor (Athlon X2 5600+ lol) to my brother, but after seeing this article I'm having second thoughts! Actually, after I show him this he should be happy with the budget rig I'm going to build for him. Gives me a reason to get my new CPU.
Currently getting 11528 in 3DMark06 on my 8800GT SLI system with this processor.
Score
0
KRayner
October 24, 2008 1:09:09 PM
@ mottamort
I stay in Kraaifontein, so not too far away geographically speaking. I find the Gigabyte GFX prices very high; I was quoting on a Leadtek 260, which is the exact same card as the Gigabyte but cost (before price increases) R1000 less ex. VAT. Currently it's a bad time to buy any tech here in SA; will be waiting until next year before getting a new GFX card.
Score
-1
coulsond
October 24, 2008 1:53:56 PM
Forgive my ignorance, but how is the AMD Athlon 64 4000+ single core directly comparable to an Athlon 64 X2 4800+ (both 939 core)? I have the dual-core version; looking at the tests such as Crysis, if mine is dual core, surely it would be better as the game is utilising multithreading. I am curious to know as I have 2 x 6800GTs in SLI and was wondering whether to upgrade to a Radeon 4850. Cheers.
Score
0
snarfies1
October 24, 2008 2:06:55 PM
Up until a few months ago I was using a 939 Athlon 64 (Venice) 3000+. It worked fairly well for the games I played the most (Civ 4, SimCity 4). Gothic 3 ran fairly well so long as I used lower settings. I upgraded to a Q9450 as soon as it came out, and Gothic 3, using the same video card, ran even more smoothly with all of the settings cranked to the max. So yeah, it can make a difference.
Score
0
computerninja7823
October 24, 2008 2:29:42 PM
This is kinda not related, but I gotta throw my two cents in... I was able to play Half-Life 2 maxed out at 1152x864 with no lag on a GeForce 2 and a P4 at 1.7GHz! Old school parts can kick some bootay! A buddy of mine was able to play Half-Life 2: Episode Two maxed at 1024x768 with a single-core Athlon at 2.4GHz or something, with an HD 2400 Pro!
Score
-3
malveaux
October 24, 2008 2:39:04 PM
Heya,
Way too much stress on the whole "higher clock" being needed to make the stronger GPUs run games well. My AMD 5000+ plays all games, including Crysis & COD4, on my 42" 1080p HDTV with my 8800GT 1GB overclocked, at 1920x1080, with good frame rates. It's not 40+ "ALL THE TIME" but it's playable and nice. When I drop it down to 1680x1050 on my 22" LCD, it's completely smooth with no hiccups. And this is at High settings.
Single-core CPUs are long dead. Yes. They bottleneck the crap out of new GPUs.
But you certainly don't need some $150 CPU to run them. A $40 Brisbane will run 'em. And the 'marked' significance is not accurately described here on that.
Cheers,
Score
0
Worf101
October 24, 2008 2:56:34 PM
Hmmm confirms what I've suspected. I'm running an FX-60 with a 3870 and I can handle most new games on reduced settings. My primary game "IL2 Sturmovik 1946" is single threaded anyway so I've always been able to run it well on XP Pro.
Next year when the AM-3's are out I'll build a whole new rig and bequeath my current one to my son. Hopefully AMD will deliver da goods and I can buy "Crysis" and "FarCry 2" and mash em up to the max.
Da Worfster
Score
0
pauldh
October 24, 2008 2:57:03 PM
To All - Thank you for the comments. Hopefully many readers will benefit from the article and the open discussion.
Thanks for mentioning the Opterons. Great dual-core 939 chips, if they can still be found. Knowing every option to search for helps if people are hunting for a 939 dual-core CPU, with the Opteron 185 (and the unlocked FX-60) being the top 939 processors out there.
Score
2
ricstorms
October 24, 2008 2:59:39 PM
Very nice article. Another thing to consider with a socket 754 processor is that it uses single-channel memory, providing a further bottleneck for performance. It would have been nice to see an old FX processor give these games a try on the single-core side, but they are near impossible to get on eBay for less than $500 or so. I remember getting Oblivion to run decently with a 3800+ X2 and two 7600GTs in SLI on my old Abit nForce 4 board. It would have been funny to see an old Pentium D in the mix, just to give the AMD fanboys something to feel good about.
Score
0
pauldh
October 24, 2008 3:03:10 PM
coulsond: Forgive my ignorance, but how is the AMD Athlon 64 4000+ single core directly comparable to an Athlon 64 X2 4800+ (both 939 core)? I have the dual-core version; looking at the tests such as Crysis, if mine is dual core, surely it would be better as the game is utilising multithreading. I am curious to know as I have 2 x 6800GTs in SLI and was wondering whether to upgrade to a Radeon 4850. Cheers.
To clarify, if we want to compare single-core vs dual-core on an equal playing field, we would compare your AMD Athlon 64 X2 4800+ to an AMD Athlon 64 4000+, as both have the same clock speed and amount of L2 cache (per core).
In older games these two performed equally, whereas now, in newer games, yes, your X2 4800+ (being a dual-core) is a far more capable gaming CPU.
Score
2
Anonymous
October 24, 2008 3:03:37 PM
This article isn't bad, though it does put too much emphasis on clock speed for the dual-core procs. It proves that in order to make a dual-core CPU system (of pretty much any speed) the bottleneck, you will need a Radeon 4850/9800GTX+. I assume no one is running 1024x768 anymore, which means it isn't even worth testing this resolution.
Score
0
enewmen
October 24, 2008 3:12:55 PM
bourgeoisdude
October 24, 2008 3:14:06 PM
pauldh
October 24, 2008 3:20:45 PM
malveaux: Heya, way too much stress on the whole "higher clock" being needed to make the stronger GPUs run games well. My AMD 5000+ plays all games, including Crysis & COD4, on my 42" 1080p HDTV with my 8800GT 1GB overclocked, at 1920x1080, with good frame rates. It's not 40+ "ALL THE TIME" but it's playable and nice. When I drop it down to 1680x1050 on my 22" LCD, it's completely smooth with no hiccups. And this is at High settings. Single-core CPUs are long dead. Yes. They bottleneck the crap out of new GPUs. But you certainly don't need some $150 CPU to run them. A $40 Brisbane will run 'em. Cheers,
Keep in mind, your X2 5000+ Brisbane is a 2.6GHz dual-core, hardly a slouch by any means even at its stock speeds. It is far more capable than if it were running 1.8-2.2GHz like the X2 3800+ to 4200+. The data points out there is a large potential difference between lower clock speeds and higher clock speeds in these dual cores. Just how much will vary depending on the game, the resolution, FSAA level, and the GPU you pair it with. Thanks for the comment.
Score
1
bourgeoisdude
October 24, 2008 3:20:50 PM
kobyhud: ...I assume no one is running 1024x768 anymore which means it isn't even worth testing this resolution.
I run 1024x768 on my 22" widescreen. I truly don't care that much about increasing the resolution vs. increasing the detail settings. In some games there is a bigger difference than others, but I find that increasing the resolution doesn't help the detail that much for me, except in some of the most recent games.
Score
1
pauldh
October 24, 2008 3:28:31 PM
bourgeoisdude: I'm running the same mobo in the test using an old Athlon 64 X2 4400+ with the GeForce 8800GTS 320MB and I'm content.
That's still a capable rig. With that mobo, you can disable one core in the BIOS and try your X2 4400+ as a single core. Easy and kinda fun if you want to see just how much better off you are in a certain game. If this system is used again for testing, that may be the method used vs swapping CPUs.
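If rebooting into the BIOS every time is a hassle, a rough software-side approximation (a sketch only, assuming Python with the psutil package is installed; the process name is a placeholder) is to pin the running game to a single core:

    # Sketch: restrict a running game to one logical CPU to mimic a single-core chip.
    # "crysis.exe" is a placeholder -- swap in the real process name.
    import psutil

    GAME_EXE = "crysis.exe"

    for proc in psutil.process_iter():
        try:
            if proc.name().lower() == GAME_EXE:
                proc.cpu_affinity([0])  # pin the process to logical CPU 0
                print(f"Pinned {GAME_EXE} (pid {proc.pid}) to core 0")
                break
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    else:
        print(f"{GAME_EXE} not found")

It isn't identical to disabling a core in the BIOS (the second core still handles the OS and drivers), so the BIOS method remains the cleaner test.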
Score
2
timaahhh
October 24, 2008 3:33:09 PM
neiroatopelcc: But your Opteron CPU still limits modern graphics cards. Two years back I bought my 8800GTX and realized it wouldn't reach its full potential with my Opteron 170 (@ 2.7GHz). A friend with another GTX paired with an E6400 chip (@ 3GHz) scored a full 30% higher in 3DMark than I did, and it showed in games. Even in WoW, where you'd expect a Casio calculator would deliver enough graphics power. In short, yes, DDR still works if you've got enthusiast parts, but that can't negate the effect a faster CPU would give. At least at decent resolutions (22" wide).
LoL at casio calculator.
Score
-1
jerreece
October 24, 2008 4:12:54 PM
My wife runs an Athlon 64 4400+ with an X1950 PRO 512MB and 2GB DDR (PC3200). Though I personally wouldn't want to play Crysis with it, it runs Warhammer Online pretty decently (at least she never complains about it). There's definitely some use left in these older systems (WoW and WAR for instance). But for the latest greatest games, system upgrades would benefit a lot.
Score
0