
Core 2 Duo E4300 or an X2 3600+?

May 3, 2007 10:44:14 AM

Hi there guys,

I'm a hard-core Half-Life series gamer, and I do mapmaking too, so I could use some more power without hurting my wallet too much.

I've been checking E6300 and E4300 benchmarks, but most tech sites never show the actual maximum framerates of HL2, and not even one comparison between single- and dual-core.

Since I have a Pentium 4 3.06GHz (Socket 775), which one is actually worth the upgrade: AMD's X2 3600+ or the E4300? I know the E4300 could cost a little more than the 3800+, but which one is better overall?

And I'm not thinking of any overclocks. I just want HL2 to load faster than my current 30 seconds and run at a higher framerate. For your info, my current framerate is 46FPS.

Any ideas? 8)


May 3, 2007 12:04:25 PM

The E4300 would be ~20% faster than the X2 3600+ in most tasks. Of course, it costs more as well. If you're willing to overclock, I'd say go with the E4300, but at stock speeds there's not a huge difference between them.

If you get the X2 3600+, make sure to use DDR2-800 RAM; if you use DDR2-533 (I'm assuming your current system has some?), it may not end up much faster than your old P4. ;) 

Oh, and HL2 is quite CPU bound, but it's not multithreaded at this point. Episode 2 may change that however.
May 3, 2007 12:20:11 PM

Quote:
Hi there guys,

I'm a hard-core Half-Life series gamer, including mapmaking, so I might need some more power out of it without hurting my wallet too much.

I've been checking the E6300 and the E4300 benchmarks, but most of the tech sites never show me the actual maximum framerates of HL2, and not even one comparison between single-and-dual core.

Since I've a Pentium 4 3.06GHz (775), which one actually worth the upgrade, the AMD's 3600 or the E4300? I know the E4300 could be a little bit more than the 3800, but which one has better overall?

And, not thinking of any overclocks. Just want my HL2 to load faster, compared to my current's 30 seconds, and run in more framerates. My current framerates are 46FPS for your info.

Any ideas? 8)

At high resolutions the E4300 will only give a few extra frames per second. The E4300 is over twice the cost of the X2 3600+, and if playing were all you were doing, I would go for the X2 3600+. Map editing, on the other hand, is going to benefit a good deal from the E4300; it compiles maps about 30% faster than the X2 3600+.

It really comes down to how much map editing you do.
May 3, 2007 12:48:29 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.


'Everyone'? How many people are running an X2 3600+ @ 3GHz in these forums? The highest I've seen is 2.8GHz. Sure, *some* people are getting 3GHz on other forums, but that is the upper limit more than the norm.

An E4300 is ~$115 retail and should overclock to 3GHz easily, 3.5GHz if you have good cooling. At such speeds it would blow away any overclocked X2. You'd need an X2 @ 4.2GHz to compete... which ain't gonna happen short of LN2.

Overall platform cost would be about $50 - $100 higher for the E4300 depending on the mobo, but don't act as if the X2 3600+ has an edge in overclocking, because it will utterly get spanked by the E4300.

As an owner of an E6300 @ 3.2GHz, you should probably know that already. If the X2 3600+ is such a great overclocker why don't you just replace your current system with one then? :lol:  :wink:
May 3, 2007 12:49:51 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.

True, but at the same time both the E4300 and the X2 3600+ are good overclockers. Once OC'ed, both end up at about the same performance per price.
May 3, 2007 12:54:00 PM

If you're having frame rate troubles, your best bet is to upgrade your RAM and GFX card.
May 3, 2007 1:02:40 PM

Quote:
If you're having frame rate troubles, your best bet is to upgrade your RAM and GFX card.


I've played HL2 on my old P4 rig and the framerate can dip quite low in some areas; it's quite a CPU-bound game once you have a semi-decent GPU.

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...

As you can see, in the last test framerates drop down to the 50s on the P4s, and that's the average framerate. Minimum framerates would be far worse, I can assure you. :wink:
May 3, 2007 1:20:22 PM

It all depends on how you look at it.

The X2 3600+ is half the cost of the E4300.
And no, the E4300 will not yield a computer twice as fast.

So logically, from a price/performance ratio, the X2 would win?

Well, maybe not.

If the system price for one is $1000 and the other is $1100 due to CPU/motherboard price differences, then we're talking about a 10% difference in cost. In that case the E4300 would be the best choice, since it easily beats the X2 3600+ on a clock-for-clock basis and can clock higher to boot.

In the end, the E4300 is hands-down better but will cost you a bit more.
If you can afford the extra $75-$100, go for it. If not, the X2 3600+ will still get you a good system.
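The component-level vs. system-level price/performance argument above boils down to simple arithmetic. A minimal sketch, using the rough 2007 prices and the ~20% stock-performance edge cited in this thread (assumptions, not measured benchmarks):

```python
# Price/performance at the component level vs. the whole-system level.
# CPU prices and the ~20% performance edge are rough figures from this
# thread, not measured benchmarks.
cpu_price = {"X2 3600+": 60.0, "E4300": 115.0}
perf = {"X2 3600+": 1.00, "E4300": 1.20}  # relative stock performance

other_parts = 940.0  # assumed identical mobo/RAM/GPU/case budget

for chip in cpu_price:
    system = cpu_price[chip] + other_parts
    print(f"{chip}: system ${system:.0f}, perf per $1000 = {1000 * perf[chip] / system:.2f}")

# The E4300 is ~2x the CPU price but only ~5% more as a system,
# so its ~20% performance edge wins on system-level price/performance.
```

In other words, the CPU's 2x price premium shrinks to a single-digit percentage once the rest of the build is counted, which is the point the post is making.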
May 3, 2007 1:21:35 PM

Quote:
Dear God..


It doesn't run that badly on a P4, TBH; still totally playable, but on certain levels I did wish I had an A64 back in the day. :wink:
May 3, 2007 1:27:48 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.


3 GHz on this chip is more an exception than a standard.... most people I see top out at 2.8 GHz, some are even getting stuck at 2.6 GHz.

This is a bit misleading.

What did you expect from Mike, the cold hard truth?! :lol: 
May 3, 2007 1:28:11 PM

Their stock performance is very close (with the E4300 winning by small percentages), but the X2 3600+ has a much better price, so you'd better go with that for stock operation. When OC-ing, a 2.8-3.0GHz E4300 performs about 20% better than an equally clocked X2 3600+, so the cup here goes to the E4300.
But bear in mind that on single-threaded apps, without OC-ing, neither the E4300 nor the X2 3600+ will give you ANY visible performance increase.
May 3, 2007 1:28:22 PM

I have the 3600+ and mostly play Source engine games. The thing whizzes through them at 1680x1050, all maxed out. I haven't done any map editing, so I can't comment on that or the E4300. For the price, I think the 3600+ is a much better deal. On my cheap ASRock board with stock cooling I got mine to 2.6GHz stable.

With better cooling, a better motherboard, and a slight bump in vCore, I'm confident this CPU can reach 3GHz (it POSTs at 3GHz already and makes it into Windows at 2.9).

I love my 3600+ and plan to keep it for a good while...
May 3, 2007 1:29:43 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.


3 GHz on this chip is more an exception than a standard.... most people I see top out at 2.8 GHz, some are even getting stuck at 2.6 GHz.

This is a bit misleading.

At least he/she didn't call anyone... "nerd boy"... this time. :wink:
May 3, 2007 1:30:23 PM

Quote:
It all depends on how you look at it.

The X2 3600 is 1/2 the cost of the E4300.
Yes the E4300 will not yield a computer twice as fast.

So logically from a price/performance ratio the X2 would win?

Well, may be not.

If the system price for one is $1000 and the other is $1100 due to CPU/MB price differences, then we are talking about a %10 difference in cost. In this case the E4300 would be the best choice since it easily beats the X2 3600 on a clock for clock basis and can clock higher to boot.

In the end, the E4300 is hands down better but will cost u a bit more.
If u can afford the xtra $75-$100, go for it. If not, the X2 3600 will still get u a good system.

Nope, at 1600x1200 or higher you won't see a 10% increase in FPS, OC'ed or not. It's more about your GPU and memory at the higher resolutions in almost all games. Oblivion is really the only one I've seen that needs the better CPU. For the prices you stated, I could add the extra $100 and get an 8800GTS while you get an 8600GTS, so really the GPU is going to kill the CPU here.
May 3, 2007 1:36:38 PM

Quote:
It all depends on how you look at it.

The X2 3600 is 1/2 the cost of the E4300.
Yes the E4300 will not yield a computer twice as fast.

So logically from a price/performance ratio the X2 would win?

Well, may be not.

If the system price for one is $1000 and the other is $1100 due to CPU/MB price differences, then we are talking about a %10 difference in cost. In this case the E4300 would be the best choice since it easily beats the X2 3600 on a clock for clock basis and can clock higher to boot.

In the end, the E4300 is hands down better but will cost u a bit more.
If u can afford the xtra $75-$100, go for it. If not, the X2 3600 will still get u a good system.

Nope at 1600X1200 or higher you want see a %10 increase in FPS OCed or not. Its more about your GPU and memory at the higher resolutions in most all games. Oblivion is really the only 1 ive seen that needs the better CPU.

HL2 is a bit more CPU bound than many games, it already shows significant CPU scaling on an X850XT @ 1280x1024... now imagine the scaling on current cards that are 2x - 4x as fast...

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...

I can assure you an overclocked E4300 would be at least 30% faster than an overclocked X2 3600+ with current GPUs, but it's quite a moot point since both would be more than fast enough to keep the framerate above 60fps at all times. Once you get past that point, it really doesn't matter how high your framerate is.
May 3, 2007 1:38:48 PM

Quote:
It all depends on how you look at it.

The X2 3600 is 1/2 the cost of the E4300.
Yes the E4300 will not yield a computer twice as fast.

So logically from a price/performance ratio the X2 would win?

Well, may be not.

If the system price for one is $1000 and the other is $1100 due to CPU/MB price differences, then we are talking about a %10 difference in cost. In this case the E4300 would be the best choice since it easily beats the X2 3600 on a clock for clock basis and can clock higher to boot.

In the end, the E4300 is hands down better but will cost u a bit more.
If u can afford the xtra $75-$100, go for it. If not, the X2 3600 will still get u a good system.

Nope at 1600X1200 or higher you want see a %10 increase in FPS OCed or not. Its more about your GPU and memory at the higher resolutions in most all games. Oblivion is really the only 1 ive seen that needs the better CPU.

HL2 is a bit more CPU bound than many games, it already shows significant CPU scaling on an X850XT @ 1280x1024... now imagine the scaling on current cards that are 2x - 4x as fast...

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...
You may want to check pages 6 and 7 of that review, as the GPU changes gain more FPS than the CPU changes. In some cases above 1400MHz the CPU curve is flat. LOL
May 3, 2007 1:44:17 PM

Quote:
It all depends on how you look at it.

The X2 3600 is 1/2 the cost of the E4300.
Yes the E4300 will not yield a computer twice as fast.

So logically from a price/performance ratio the X2 would win?

Well, may be not.

If the system price for one is $1000 and the other is $1100 due to CPU/MB price differences, then we are talking about a %10 difference in cost. In this case the E4300 would be the best choice since it easily beats the X2 3600 on a clock for clock basis and can clock higher to boot.

In the end, the E4300 is hands down better but will cost u a bit more.
If u can afford the xtra $75-$100, go for it. If not, the X2 3600 will still get u a good system.

Nope at 1600X1200 or higher you want see a %10 increase in FPS OCed or not. Its more about your GPU and memory at the higher resolutions in most all games. Oblivion is really the only 1 ive seen that needs the better CPU.

HL2 is a bit more CPU bound than many games, it already shows significant CPU scaling on an X850XT @ 1280x1024... now imagine the scaling on current cards that are 2x - 4x as fast...

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...
You may want to check pages 6 and 7 of that review as the GPU changes gets more FPS that the CPU change. In some cases above 1400mhz the CPU is flat. LOL

I edited my original post just as you posted. ;) 

GPUs are 2x - 4x as fast compared to 2005. CPUs have only advanced about 1.5x in that time. The fastest CPU in the article is a 2.6GHz A64 FX, which even by today's standards is not exactly 'slow'; it would put a stock X2 3600+ to shame in single-threaded benchmarks, and it'll even beat low-end C2Ds like the E4300.

Like I said in my earlier post, it's really a moot point because the framerate would be so high on both systems (once overclocked of course ;)  ) that I doubt anyone would be able to tell the difference.
May 3, 2007 1:52:25 PM

Boys,

Let's not start a huge argument over this. I just want HL2 to load quickly, like in less than 10 seconds. My current rig takes about 30 seconds, which feels like forever.

Also, I'm into mapmaking, and maybe I'll start doing it again. I've been mapmaking since '01 (HL classic), but stopped because I had a weak K6-2, and since then I haven't done much mapping because it took too long.

On top of that, here's my summary:

1.) I have DDR2-667 RAM. If I plan to save some more dough (C2D is still a little too pricey, even the E4300), is the X2 3600+ a good choice?

2.) No overclocking at all, since this is the central PC at home. Period.

3.) Futureproofing - I wanna play games that have dual-core support. I know Socket 775 will still be around until next year or so, so if I buy an E4300 combo, later when prices drop, a new E6600 can sit on top of it instead. But I know I can't do that on AM2, since AMD likes changing sockets, which is very annoying sometimes. (No, I'm not an Intel or AMD fanboy - please take note.)

That's all. Whichever processor cuts HL2's load time in half, I'm moving to it. If the E4300 loads in, um, only 5 seconds, I'll pick that for sure. If the X2 3600+ squeezes out a 10-second load, I'll choose that.

Opinions?
May 3, 2007 1:53:18 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.


'Everyone'? How many people are running an X2 3600+ @ 3GHz in these forums, the highest I've seen is 2.8GHz. Sure, *some* people are getting 3GHz on other forums, but that is the upper limit more than the norm.

An E4300 is ~$115 retail and should overclock to 3GHz easily, 3.5GHz if you have good cooling. At such speeds it would blow away any overclocked X2. You'd need an X2 @ 4.2GHz to compete... which ain't gonna happen short of LN2.

Overall platform cost would be about $50 - $100 higher for the E4300 depending on the mobo, but don't act as if the X2 3600+ has an edge in overclocking, because it will utterly get spanked by the E4300.

As an owner of an E6300 @ 3.2GHz, you should probably know that already. If the X2 3600+ is such a great overclocker why don't you just replace your current system with one then? :lol:  :wink:

My 119kg daddy can beat up your 59kg daddy!

Ever stop and think how dumb that sounds? Probably not...
May 3, 2007 1:54:21 PM

Quote:
It all depends on how you look at it.

The X2 3600 is 1/2 the cost of the E4300.
Yes the E4300 will not yield a computer twice as fast.

So logically from a price/performance ratio the X2 would win?

Well, may be not.

If the system price for one is $1000 and the other is $1100 due to CPU/MB price differences, then we are talking about a %10 difference in cost. In this case the E4300 would be the best choice since it easily beats the X2 3600 on a clock for clock basis and can clock higher to boot.

In the end, the E4300 is hands down better but will cost u a bit more.
If u can afford the xtra $75-$100, go for it. If not, the X2 3600 will still get u a good system.

Nope at 1600X1200 or higher you want see a %10 increase in FPS OCed or not. Its more about your GPU and memory at the higher resolutions in most all games. Oblivion is really the only 1 ive seen that needs the better CPU.

HL2 is a bit more CPU bound than many games, it already shows significant CPU scaling on an X850XT @ 1280x1024... now imagine the scaling on current cards that are 2x - 4x as fast...

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...
You may want to check pages 6 and 7 of that review as the GPU changes gets more FPS that the CPU change. In some cases above 1400mhz the CPU is flat. LOL

I edited my original post just as you posted. ;) 

GPUs are 2x - 4x as fast compared to 2005. CPUs have only advanced about 1.5x in that time. The fastest CPU in the article is a 2.6GHz A64 FX... which even by todays standards is not exactly 'slow' and would put a stock X2 3600+ to shame in single threaded benchmarks, hell it'll even beat low end C2Ds like the E4300.

Like I said in my earlier post, it's really a moot point because the framerate would be so high on both systems (once overclocked of course ;)  ) that I doubt anyone would be able to tell the difference.
So I take it you're agreeing with me now? GPUs are the better upgrade for gaming, and buying the cheapest CPU to get the better GPU is the right choice? CPUs from 2005 are fast enough for today's games, except maybe Oblivion? In your review the game was CPU bound only because it was running at a low resolution; at 1600x1200 (note page 6) the GPU makes the bigger difference.
May 3, 2007 1:54:38 PM

Quote:

My 119 kg daddy can beat up your 59kg daddy!

Ever stop and think how dumb that sounds? probably not...


Not if my 59kg daddy is Bruce Lee. 8) :lol: 
May 3, 2007 1:58:14 PM

All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands-on experience playing HL2 and its spinoffs like CS:S? If you did, you'd realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.
May 3, 2007 2:02:17 PM

No doubt.
But that is why I said, if you have the extra $75-$100.

Most games are more GPU than CPU bound when using current processors and the money should be invested there first.

The same rule applies when looking at an E4300 vs an E6600 or similar choices. But in those cases the E4300 can easily be tweaked to match the power of the higher CPU. That's not the case with the X2 3600+, since its ceiling is much lower than that of the C2D chips.
May 3, 2007 2:02:49 PM

So, uh... which one is value for money?

I know it's difficult to future-proof my PC, but I just need to make a transition so I can play HL2 with pleasure. The long load times and map compile times are sometimes annoying, and I'd like to get ahead.

If I use a cheap 945 (C2D support) board, I can still upgrade it up to an E6700, but on a cheap AM2 board I can never do future upgrades.

How true are my statements? Remember, please don't hit me too hard; I did my research before asking. :D 
May 3, 2007 2:03:09 PM

Quote:
Boys,

Let's not start a huge argument out of it. I just want my HL2 to load quickly, like less than 10 seconds on it. My current rig is like 30 seconds on it and it's quite like forever.

Also, I'm into mapmaking, and maybe I'll be starting to do it again. Been mapmaking since 01 (HL classic) , but stopped because I had a weaker K6-2, and since then I didn't do much mapping as it took too long time to do it.

On top of that, here's my summary:

1.) I have a DDR2-667 RAM. If I plan to save some more dough (C2D is still a little bit too pricey, for even a E4300), is it good to use an X2-3600?

2.) No overclocking is done at all, since this is a central PC at home. Period.

3.) Futureproofing - I wanna play games which have dual-core support.

That's all. If loading HL2 cuts the time into half - I'm moving to that particular processor. If that E4300 loads like um... only 5 seconds, I'll pick that for sure. If that X2-3600 squeezes out a 10 second load, I'll choose that.

Opinions?

1.) The X2 3600+ is fast enough for any game, but the E4300 would be better for map editing.
2.) OCing is safe as long as you don't push the CPU to its limit. An OC like the one in my sig is safe and well below any heat problems, as most CPUs run fine up to 58C. This won't work on a stock cooler, but even a cheap aftermarket cooler on a 65nm chip is good for some OCing.
3.) For futureproofing, look at how much RAM your mobo maxes out at and what GPU slot it has.
May 3, 2007 2:05:10 PM

Quote:

On top of that, here's my summary:

1.) I have a DDR2-667 RAM. If I plan to save some more dough (C2D is still a little bit too pricey, for even a E4300), is it good to use an X2-3600?

2.) No overclocking is done at all, since this is a central PC at home. Period.

3.) Futureproofing - I wanna play games which have dual-core support. I know the Socket 775 will be still around until next year or whatever it is, so if I buy the E4300 combo, later when these are getting cheaper, a new E6600 will be sittin' on top of it instead. But I know I can't do it on an AM2 since the AMD guys liked changing sockets which is very annoying sometimes. (no, I'm no intel or AMD fanboy - please take note.)


Actually, AM2 is the better upgrade candidate, since you only have to spend $60 now to get into it, and a year from now you can upgrade for a lower combined cost.

You have to be careful when moving from a 3+GHz Netburst down to a 1.8GHz modern CPU. In many cases the new microarchitecture cannot totally make up for the loss of raw clock speed. If your 3.06GHz P4 has Hyper-Threading, I doubt you'll see better than a 30% improvement in load times. If current load times are 30 sec, you're probably looking at about 20 sec at best with the 3600+, especially since your memory bandwidth won't be increasing any.
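The load-time estimate above is straightforward to check. A minimal sketch, treating the poster's "30% improvement" as a 30% reduction in load time (these figures are this post's guesses, not benchmarks):

```python
# Estimate a new load time from an assumed fractional reduction.
# A ~30% cut on a 30-second load gives ~21 s, i.e. "about 20 sec at best".
def estimated_load_time(current_seconds: float, reduction: float) -> float:
    """reduction=0.30 means load times drop by roughly 30%."""
    return current_seconds * (1.0 - reduction)

print(estimated_load_time(30.0, 0.30))  # ~21 seconds
```

Note this falls well short of the OP's stated goal of a sub-10-second load, which would need roughly a 3x speedup, not 30%.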
May 3, 2007 2:11:16 PM

Quote:
All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands on experience playing HL2 and it's spinoffs like CS:S? If you did you'll realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.

I have a son that's heavy into CS:S and we both have copies, so yes, I have played a bit. Note page 6 of your review, where the gains slow above 1600MHz on the CPU. With the top CPU you only see a change of 20FPS, but with just the high-end cards the change was 37FPS. If the game were CPU bound on page 6, you would see almost no gains from the only slightly better GPUs. I see big differences in the GPUs, not the CPU. The only CPU-bound game I know of is Oblivion. Crysis maybe, but it's not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...
May 3, 2007 2:13:36 PM

Quote:
Actually the AM2 is much better upgrade candidate since you only gotta spend $60 now to get into it and a year from now get an upgrade for less combined cost.

You gotta be careful when moving from a 3+GHz netburst down to a 1.8GHz modern cpu. In many cases the new uA cannot totally make up for the loss of raw clock speed. If your 3.06 P4 has HTT I doubt you will see better than a 30% improvement in load times. If current load times are 30 sec you're probably looking at about 20sec at best with the 3600, especially since your memory bandwidth wont be increasing any.


Oh I see.

I don't know, but the X2 3600+ looks limited on the other side. If I use an E4300, is it a major leap ahead of the P4? I know the Core architecture is way better than any Netburst available, so the lower gigahertz rating won't matter with Core.

As I mentioned, my RAM isn't DDR2-800, just DDR2-667.

The X2 is just two Athlon 64 3000+s fused together, and one core of that pair will be just as fast as my P4, or slightly slower. Is that true?
May 3, 2007 2:18:10 PM

Quote:
Actually the AM2 is much better upgrade candidate since you only gotta spend $60 now to get into it and a year from now get an upgrade for less combined cost.

You gotta be careful when moving from a 3+GHz netburst down to a 1.8GHz modern cpu. In many cases the new uA cannot totally make up for the loss of raw clock speed. If your 3.06 P4 has HTT I doubt you will see better than a 30% improvement in load times. If current load times are 30 sec you're probably looking at about 20sec at best with the 3600, especially since your memory bandwidth wont be increasing any.


Oh I see.

I don't know, but the X2-3600 looks limited on the other side. If I use a E4300, is it a major leap ahead from P4? I know Core architecture are way better than any Netbursts available, so the lower gigahertz rating won't matter in Cores.

As I mentioned, my RAM stick isn't DDR2-800. Just DDR2-667.

The X2 is just two Athlon64s 3000+ fused together, and one core from these couple will be just as fast as my P4 or slightly slower. Is it true?

If you plan on keeping your current RAM then I suggest the E4300, since C2D does not suffer nearly as much from lower spec RAM as AM2 does, plus it'll allow you to overclock to 3GHz while keeping the RAM in spec.

As many people have said already, the E4300 is faster, and overclocks better, but it comes at a cost. You'll most likely see the biggest difference in mapmaking, since I believe both CPUs are more than fast enough to run the game at 60fps+ once overclocked.

Another poster said to be wary about moving from a ~3GHz Netburst to a ~2GHz X2/C2D. This is a very good point. If you don't plan on overclocking then the X2 3600+ (and to a lesser extent the E4300) won't be a hell of a lot quicker than your existing P4. However once you overclock the chips you should see a much bigger performance gulf.
May 3, 2007 2:19:54 PM

E4300
Just better and NOT more expensive as some others have already pointed out.
May 3, 2007 2:21:19 PM

Quote:
All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands on experience playing HL2 and it's spinoffs like CS:S? If you did you'll realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.

I have a son thats into CS:S heavy and we both have copys so yes I have played a bit. Note page 6 of your review where the gains slow above 1600MHz on the CPU. At the top CPU you only see a change of 20FPS but with just high end cards the change was 37FPS. If it was the case the CPU was bound on page 6 you would see almost no gains from the only slightly better GPU's. I see big differances in the the GPU's not the CPU. The only CPU bound game I know of is Oblivion. Crysis maybe but its not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...

You have to realise that the page you are linking to is showing significant gains in CPU scaling, especially with the top end GPU of the time (X850XT).

My point all along is that with GPUs being 2 - 4 times as powerful as the X850XT, you will see even more CPU scaling in HL2 now than 2 years ago.

But as I've said many times already, it's quite moot if an overclocked E4300 gets 200fps while the overclocked X2 3600+ gets 150fps.
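The "quite moot" point can be made concrete: with vsync on a typical 60Hz display of the era, anything rendered above the refresh rate is never shown. A tiny sketch using the hypothetical 200fps/150fps numbers from this post:

```python
# With vsync enabled, visible framerate is capped at the display refresh
# rate; frames rendered beyond that are simply never displayed.
REFRESH_HZ = 60

def visible_fps(rendered_fps: float, refresh_hz: float = REFRESH_HZ) -> float:
    return min(rendered_fps, refresh_hz)

print(visible_fps(200))  # hypothetical overclocked E4300   -> 60
print(visible_fps(150))  # hypothetical overclocked X2 3600+ -> 60
```

Both chips land at the same visible 60fps, which is why the difference stops mattering once each stays above the refresh rate.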
May 3, 2007 2:22:49 PM

Quote:
Actually the AM2 is much better upgrade candidate since you only gotta spend $60 now to get into it and a year from now get an upgrade for less combined cost.

You gotta be careful when moving from a 3+GHz netburst down to a 1.8GHz modern cpu. In many cases the new uA cannot totally make up for the loss of raw clock speed. If your 3.06 P4 has HTT I doubt you will see better than a 30% improvement in load times. If current load times are 30 sec you're probably looking at about 20sec at best with the 3600, especially since your memory bandwidth wont be increasing any.


Oh I see.

I don't know, but the X2-3600 looks limited on the other side. If I use a E4300, is it a major leap ahead from P4? I know Core architecture are way better than any Netbursts available, so the lower gigahertz rating won't matter in Cores.

As I mentioned, my RAM stick isn't DDR2-800. Just DDR2-667.

The X2 is just two Athlon64s 3000+ fused together, and one core from these couple will be just as fast as my P4 or slightly slower. Is it true?

If you plan on keeping your current RAM then I suggest the E4300, since C2D does not suffer nearly as much from lower spec RAM as AM2 does, plus it'll allow you to overclock to 3GHz while keeping the RAM in spec.

As many people have said already, the E4300 is faster, and overclocks better, but it comes at a cost. You'll most likely see the biggest difference in mapmaking, since I believe both CPUs are more than fast enough to run the game at 60fps+ once overclocked.

Another poster said to be wary about moving from a ~3GHz Netburst to a ~2GHz X2/C2D. This is a very good point. If you don't plan on overclocking then the X2 3600+ (and to a lesser extent the E4300) won't be a hell of a lot quicker than your existing P4. However once you overclock the chips you should see a much bigger performance gulf.

Thanks,

I'll pick the E4300. I know dropping from a Netburst to a Core will be risky, but I'm futureproofing for more games; who knows, maybe they'll be multithreaded. No overclocking will be done, so I'll just leave the chip alone. And you said the Core does fine on DDR2-667, since it's AMD that needs DDR2-800 for max performance. What do you think?
May 3, 2007 2:27:12 PM

If you want to see faster map load times, consider a Raptor hard drive or RAID 0. The biggest leap in load times I experienced was with my Raptor, after I'd OC'd my E4300 to 2.97GHz... In HL2 it loads the initial map in about 30-45 secs, and an in-map load usually takes 15-20 seconds. Maybe a little less; I never sat down and timed it precisely.
May 3, 2007 2:35:45 PM

Quote:


Thanks,

I'll pick the E4300. I know - dropping from a Netburst to a Core will be risky, but I'm futureproofing it for some more games. Who knows they are multithreaded? No overclocking is done, so I'll just leave the chip alone. And you said that the Cores work best on DDR-667, since AMD only have max performance on DDR2-800. What do you think?


Well, in your circumstances I'd say it's a good choice. Being able to keep your existing RAM will cut down on costs, and you'll see more of a noticeable improvement over the P4 with the E4300.

You may have to upgrade your mobo though; if it's a 945 chipset it *may* support C2D with a BIOS update - some do, some don't.
May 3, 2007 2:40:48 PM

Quote:
Well the 3600+ is obviously the best buy for the money. $59, $69 retail now at newegg I think, and everyone is overclocking them to 3ghz.


'Everyone'? How many people are running an X2 3600+ @ 3GHz in these forums, the highest I've seen is 2.8GHz. Sure, *some* people are getting 3GHz on other forums, but that is the upper limit more than the norm.

An E4300 is ~$115 retail and should overclock to 3GHz easily, 3.5GHz if you have good cooling. At such speeds it would blow away any overclocked X2. You'd need an X2 @ 4.2GHz to compete... which ain't gonna happen short of LN2.

Overall platform cost would be about $50 - $100 higher for the E4300 depending on the mobo, but don't act as if the X2 3600+ has an edge in overclocking, because it will utterly get spanked by the E4300.

As an owner of an E6300 @ 3.2GHz, you should probably know that already. If the X2 3600+ is such a great overclocker why don't you just replace your current system with one then? :lol:  :wink:

By everyone I mean the majority. Check the forums.
If the chip had been out at the time I upgraded my AM2 system, I surely would have bought it.

What forums? I frequent these forums the most, and I've yet to see someone run an X2 3600+ @ 3GHz. Look at Elbert in this thread: he is running it at 2.36GHz, and while I'm aware that's a conservative overclock, please don't insinuate that the majority of people are running X2 3600+s at 3GHz, because that is clearly not the case.

You don't see people claiming E4300s do 100% 3.6GHz overclocks all the time because that is more of an UPPER LIMIT than the NORM. Same as 3GHz for the X2 3600+.
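The overclock percentages being argued here are easy to sanity check. A minimal sketch, assuming the era's commonly cited stock clocks (E4300 = 1.8GHz, 65nm X2 3600+ = 1.9GHz; these figures are my assumption, not from the thread):

```python
# Sanity check on the overclocking claims: percentage gain over stock.
# Assumed stock clocks: E4300 = 1.8GHz, X2 3600+ (65nm) = 1.9GHz.

def oc_percent(stock_ghz: float, target_ghz: float) -> float:
    """Overclock expressed as a percentage gain over the stock clock."""
    return (target_ghz - stock_ghz) / stock_ghz * 100

print(f"E4300 to 3.0GHz:    +{oc_percent(1.8, 3.0):.0f}%")  # the 'easy' target
print(f"E4300 to 3.6GHz:    +{oc_percent(1.8, 3.6):.0f}%")  # the '100%' upper limit
print(f"X2 3600+ to 3.0GHz: +{oc_percent(1.9, 3.0):.0f}%")  # the disputed claim
```

So a 3.6GHz E4300 really is a 100% overclock, which is why it's an upper limit rather than the norm, just like 3GHz (roughly +58%) on the X2 3600+.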
May 3, 2007 2:48:33 PM

Quote:
All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands-on experience playing HL2 and its spinoffs like CS:S? If you did, you'd realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.

I have a son that's into CS:S heavily and we both have copies, so yes, I have played a bit. Note page 6 of your review, where the gains slow above 1600MHz on the CPU. At the top CPU you only see a change of 20FPS, but with just the high end cards the change was 37FPS. If the CPU were the bound on page 6, you would see almost no gains from the only slightly better GPUs. I see big differences in the GPUs, not the CPU. The only CPU bound game I know of is Oblivion. Crysis maybe, but it's not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=...

You have to realise that the page you are linking to is showing significant gains in CPU scaling, especially with the top end GPU of the time (X850XT).

My point all along is that with GPUs being 2 - 4 times as powerful as the X850XT, you will see even more CPU scaling in HL2 now than 2 years ago.

But as I've said many times already, it's quite moot if an overclocked E4300 gets 200fps while the overclocked X2 3600+ gets 150fps.
I realize you are trying to prove a point with outdated information. You first quoted this review in poor taste. And as stated, it's quite moot: an X2 3600+ with an 8800GTS can get higher FPS than an E4300 with an 8600GTS for about the same price. The page I link shows a falling off in improvements above a 1600MHz CPU and shows a large difference in even the top GPUs. Any CPU sold today can handle this game, as it's getting quite old and is in no way bound by any current CPU. For a game to be CPU bound, the changes in GPUs would show little or no change. Even this old test shows this not to be so.

Now if you want to bind the game to the CPU you can get FPS like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.
May 3, 2007 2:57:06 PM

I'm not saying I'm agreeing.

I think the $75 for upgrading from an X2 to C2D is more than worth it.
If you have very minimal funds, an X2 could suffice.

I would suggest the poster spend the afternoon washing a few cars or perhaps raking leaves for a neighbor and get the funds to upgrade to the C2D.
May 3, 2007 2:59:07 PM

No game except perhaps FS-X can really be described as totally CPU bound. Even Oblivion is more GPU bound than CPU bound, but it still shows CPU scaling in certain situations, as I'm sure you're well aware.

To be fair, the HL2 engine has been upgraded to support HDR so it does tax GPUs more than it did when the article was done, but I still think it will exhibit more CPU scaling now than it did in 2005.

I guess I should clarify that by 'CPU bound' I don't mean the game is totally CPU limited, I am just saying that HL2 shows greater scaling from increased CPU speeds than many other games/game engines, new or old.

Here are some NEW HL2: Lost Coast benchmarks showing CPU scaling with the latest 8800 GPUs and X2/C2D CPUs:
http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...
http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...
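The CPU-limited vs GPU-limited distinction being argued here can be sketched numerically. A minimal illustration with made-up FPS numbers (not taken from any of the linked reviews): whichever component swap moves the framerate more is the tighter bottleneck at that resolution.

```python
# Rule of thumb from the argument above: compare the FPS gain from a
# CPU swap (same GPU) against the gain from a GPU swap (same CPU).
# Whichever swap moves the framerate more is the tighter bottleneck.

def scaling(fps_before: float, fps_after: float) -> float:
    """Fractional FPS gain from upgrading a single component."""
    return (fps_after - fps_before) / fps_before

# Hypothetical numbers for illustration only (NOT from any linked review):
cpu_gain = scaling(120.0, 150.0)  # faster CPU, same GPU: +25%
gpu_gain = scaling(120.0, 126.0)  # faster GPU, same CPU: +5%

bottleneck = "CPU" if cpu_gain > gpu_gain else "GPU"
print(f"{bottleneck}-limited here: CPU swap {cpu_gain:+.0%}, GPU swap {gpu_gain:+.0%}")
```

This also shows why both posters can cite the same review: at low resolutions the CPU swap dominates, while at 1600x1200 the GPU swap does.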
May 3, 2007 3:02:56 PM

Quote:
I'm not saying I'm agreeing.

I think the $75 for upgrading from an X2 to C2D is more than worth it.
If you have very minimal funds, an X2 could suffice.

I would suggest the poster spend the afternoon washing a few cars or perhaps raking leaves for a neighbor and get the funds to upgrade to the C2D.

I think the $75 would be better spent on a better GPU, i.e. an 8800GTS instead of an 8600GTS. Game wise, the GPU is the better choice, and I wouldn't spend an extra $75 for a CPU with a 2~3 FPS increase at 1600x1200. Map editing is the only thing needing the E4300, and only if the OP does it enough to warrant the $75.
May 3, 2007 3:05:50 PM

Quote:

Now if you want to bind the game to the CPU you can get FPS like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.


I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.

Of course, if I had to choose between an X2 3600+/8800GTS or an E4300/8600GTS for gaming, I would surely choose the X2 3600+/8800GTS config. But an E4300/8800GTS would be a more balanced system, budget permitting of course. :wink:
May 3, 2007 3:08:08 PM

And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and paying for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having a date night out.
May 3, 2007 3:14:12 PM

Quote:
And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and paying for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having a date night out.


I know what you are saying. I'm building a new rig and was tempted by the X2 3600+ due to the low price. But in the end I opted for the E4300 because it only represented a ~$50 difference in system price, and since I'm spending around $1000 anyway that $50 really wasn't too big a deal for ~30% greater CPU performance after overclocking.

I may still end up with an X2 3600+ as a secondary/lounge room PC, they are just too cheap to pass up! 8)
May 3, 2007 3:15:16 PM

Quote:

Now if you want to bind the game to the CPU you can get FPS like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.


I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2~3 FPS difference with an X2 3600+. Your link of even older GPUs doesn't show bottlenecks at 1280x1024, as the lowest on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.
May 3, 2007 3:22:11 PM

Quote:

Now if you want to bind the game to the CPU you can get FPS like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.


I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2~3 FPS difference with an X2 3600+. Your link of even older GPUs doesn't show bottlenecks at 1280x1024, as the lowest on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.

I'm not even sure what we're arguing about anymore. Are you disputing HL2 shows greater CPU scaling than many other games?

I think any enthusiast would know an X2 3600+/8800GTS is better than an E4300/8600GTS for gaming. They would also be aware that an E4300/8800GTS would be a more balanced system in terms of CPU vs GPU power.

Reality check: A stock X2 3600+ is slower than a 3 year old A64 3200+ in single threaded gaming. It will bottleneck an 8800GTS in many cases at 1280x1024. I (and many others) happen to own a 1280x1024 LCD, so while you may not find it relevant to yourself, it is relevant to a LOT of people.

I'm sorry, what are we discussing now?
May 3, 2007 3:24:16 PM

Quote:
And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and paying for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having a date night out.
LOL, that's about the difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.
May 3, 2007 3:28:26 PM

Quote:

Now if you want to bind the game to the CPU you can get FPS like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.


I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2~3 FPS difference with an X2 3600+. Your link of even older GPUs doesn't show bottlenecks at 1280x1024, as the lowest on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.

I'm not even sure what we're arguing about anymore. Are you disputing HL2 shows greater CPU scaling than many other games?

I think any enthusiast would know an X2 3600+/8800GTS is better than an E4300/8600GTS for gaming. They would also be aware that an E4300/8800GTS would be a more balanced system in terms of CPU vs GPU power.

Reality check: A stock X2 3600+ is slower than a 3 year old A64 3200+ in single threaded gaming. It will bottleneck an 8800GTS in many cases at 1280x1024. I (and many others) happen to own a 1280x1024 LCD, so while you may not find it relevant to yourself, it is relevant to a LOT of people.

I'm sorry, what are we discussing now?
The CPU bound properties of HL2 and system price differences. Come on Epsilon84, try to keep up. If you need me to agree with you, find an Oblivion benchmark with an X2 3600+ and I'll agree it's bottlenecked. Now the reality check is that even a 3 year old A64 didn't show the game to be CPU bound. I use a 1280x1024 CRT and see no CPU bounds unless I play Oblivion.
May 3, 2007 3:32:55 PM

Quote:

LOL, that's about the difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.


I'm not quite sure why you are so obsessed with the 8800GTS vs 8600GTS comparison.

You seem hell bent on comparing a X2 3600+ vs E4300 system at the same pricepoint. Why is that? I think it's clear to everyone that for a comparable config to an X2 platform, the C2D will cost $50 - $100 more. With that comes greater CPU performance. It's really quite simple, I'm quite amazed we are still discussing this topic.
May 3, 2007 3:37:00 PM

Quote:

LOL, that's about the difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.


I'm not quite sure why you are so obsessed with the 8800GTS vs 8600GTS comparison.

You seem hell bent on comparing a X2 3600+ vs E4300 system at the same pricepoint. Why is that? I think it's clear to everyone that for a comparable config to an X2 platform, the C2D will cost $50 - $100 more. With that comes greater CPU performance. It's really quite simple, I'm quite amazed we are still discussing this topic.
Simple: it's about the same price difference as between the X2 3600+ and the E4300. God Epsilon84, you fall behind fast. Now, is that $100 price worth 2~3 FPS? I say no, and the OP should base the CPU choice only on map making.
May 3, 2007 3:37:10 PM

Quote:
Come on Epsilon84, try to keep up. If you need me to agree with you, find an Oblivion benchmark with an X2 3600+ and I'll agree it's bottlenecked. Now the reality check is that even a 3 year old A64 didn't show the game to be CPU bound. I use a 1280x1024 CRT and see no CPU bounds unless I play Oblivion.


It's funny you say you are CPU bound in Oblivion, since it is far more GPU bound than HL2 ever was (or is). Are you sure you're not exceeding the 320MB memory with excessive AA?

Oblivion CPU scaling:
http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...
http://www.firingsquad.com/hardware/geforce_8800_gtx_gt...

I don't see much CPU bottlenecking on an 8800GTS, I'm afraid. It's only evident on the 8800GTX.