
Phenom II X6 vs Phenom II X4

November 4, 2012 1:38:03 AM

Which CPU would be better for gaming/everyday performance?


November 4, 2012 1:44:23 AM

From my experience I would recommend going with a Phenom II X4; just look for the one with the highest clock.
The deal here is that after a certain point extra cores give less and less extra performance, and I don't know of any games that can make good use of more than 4 cores.
I used to have a Phenom II X4 965 at 3.4 GHz and it was really good; I had it OC'd to 4.1 GHz. I guess there are other models now with a higher default clock.
November 4, 2012 1:46:38 AM

I thought that BF3 used more than 4 cores while playing online.
November 4, 2012 1:58:42 AM

corymartin66 said:
I thought that BF3 used more than 4 cores while playing online.

It does, but it's one of the very few games that do. If you're planning on sticking with your 550 Ti, the 550 Ti is going to be the limiting factor in performance in pretty much any game. I'd just grab a Phenom II 965 Black Edition, or maybe even go with an FX-6300 if you still want a six-core.

Then again, that motherboard you have isn't AM3+, is it?
November 4, 2012 2:02:19 AM

Hmm, maybe it does, I'm not really sure. But again, after a certain number of cores it doesn't make sense to add more. An easy way to see that is that a 2-core processor will not have double the speed of a 1-core processor running at the same clock, and each core added gives less and less extra performance.
The problem nowadays is that since around 2004 we have been stuck at a wall when it comes to making processors with a higher clock speed; we don't know how to get past it yet, and it's the clock that would really give a big boost in performance.
That's why we don't have processors running at 10-20 GHz these days :(

If you have any interest in the subject (I doubt it) and want to read more, here's a link:

http://www.gotw.ca/publications/concurrency-ddj.htm
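For anyone curious, the diminishing returns being described are usually formalized as Amdahl's law. Here's a minimal Python sketch of it; the parallel fractions used below are made-up, purely illustrative values, not measurements of any real game:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the
# work that can run in parallel and n is the number of cores.
# The p values below are purely illustrative, not measured from any real game.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup of a workload when spread over `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    for p in (0.5, 0.8, 0.95):                      # assumed parallel fractions
        scaling = [amdahl_speedup(p, n) for n in (1, 2, 4, 6, 8)]
        print(f"p={p:.2f}: " + ", ".join(f"{s:.2f}x" for s in scaling))
```

With p = 0.8, for example, 4 cores give about 2.5x and 6 cores only about 3.0x, which is the "less and less extra performance" point above.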
November 4, 2012 2:02:26 AM

The X6 1100T was, and still is, one of the best AMD CPUs overall. Only the newest Piledriver chips can match it.
November 4, 2012 2:08:22 AM

Quote:
Hmm, maybe it does, I'm not really sure,


BF3 is known to use all 8 cores on an FX-8. The rumor mill even says it treats Intel's HyperThreads as if they were cores; if that's true, it's the only game on the market that can do it.

Quote:
but again, after a certain number of cores it doesn't make sense to add more. An easy way to see that is that a 2-core processor will not have double the speed of a 1-core processor running at the same clock, and each core added gives less and less extra performance


Depends on what kinds of applications you're running. Servers can have processors with dozens of cores in them. Major overkill for the average "daily user" or gamer.
November 4, 2012 2:12:35 AM

nekulturny said:
Depends on what kinds of applications you're running. Servers can have processors with dozens of cores in them. Major overkill for the average "daily user" or gamer.


Yeah, I was expecting him to only use his PC for gaming and maybe video encoding, if that.
November 4, 2012 2:12:37 AM

It has to be a 95 W CPU, and to me my GTX 550 Ti is a good card; it runs my games just fine.
November 4, 2012 2:13:25 AM

corymartin66 said:
It has to be a 95 W CPU, and to me my GTX 550 Ti is a good card; it runs my games just fine.


While I've had processors that required a higher wattage than the motherboard was rated for run just fine, I wouldn't recommend it.
November 4, 2012 2:14:29 AM

Just gaming... BF3 for the most part.
November 4, 2012 2:17:31 AM

corymartin66 said:
Just gaming... BF3 for the most part.

I had a 550 Ti before I dropped the 7870 in, so I'm quite familiar with its capabilities. It's a fine card for mid-level gaming. If you're limited to a 95 W TDP on that board, I wouldn't bother upgrading the CPU: the 95 W Phenom IIs won't be much of an upgrade over your Athlon tri-core. And no, I wouldn't drop a 125 W TDP CPU into it; it may not work at all, and if it does, you may find yourself wondering what that burning smell coming from your case is.
November 4, 2012 3:38:26 AM

You would be better off getting a new board and a Phenom II 970. A 980 isn't really worth it because of the price, and besides, the 970 can be OC'd to 4.2 GHz with no problems.
A quad-core at a higher clock speed would be much better than a hexa-core at a slower clock.
November 4, 2012 4:14:36 AM

jaideep1337 said:
You would be better off getting a new board and a Phenom II 970. A 980 isn't really worth it because of the price, and besides, the 970 can be OC'd to 4.2 GHz with no problems.
A quad-core at a higher clock speed would be much better than a hexa-core at a slower clock.


Honestly, if he were to get a new motherboard, I would suggest he get one for an Intel processor, since these last few generations Intel processors have unfortunately been a lot better. So I would suggest an Intel i7 2000 series or 3000 series; just watch the socket so you buy the proper motherboard.
November 4, 2012 4:50:03 AM

With a 550 Ti video card, there is no need to buy an Intel i7 for gaming. There's no need to buy an i7 even for a high-end gaming system with CrossFire 7970s or SLI GTX 680s. Stronger CPUs cannot make weaker video cards perform better in gaming. A 550 Ti, while a decent entry-to-mid-level graphics card, is not up to snuff for 1080p-and-above gaming, so the point is moot.

i7s have HyperThreading; that's the only thing i7s have that i5s don't. You don't need it to max out any game with high-end graphics cards, and you never will. HyperThreading has been around for several years and generations of Intel CPUs, and it has yet to bring anything necessary or greatly advantageous to gamers. With the exception of the rumor that BF3 can use HyperThreads, most games are written in the same primitive way they were written 15 years ago; as such they don't know what HyperThreads are, much less how to use one.
November 4, 2012 4:53:40 AM

nekulturny said:
With a 550 Ti video card, there is no need to buy an Intel i7 for gaming. There's no need to buy an i7 even for a high-end gaming system with CrossFire 7970s or SLI GTX 680s. Stronger CPUs cannot make weaker video cards perform better in gaming. A 550 Ti, while a decent entry-to-mid-level graphics card, is not up to snuff for 1080p-and-above gaming, so the point is moot.

i7s have HyperThreading; that's the only thing i7s have that i5s don't. You don't need it to max out any game with high-end graphics cards, and you never will. HyperThreading has been around for several years and generations of Intel CPUs, and it has yet to bring anything necessary or greatly advantageous to gamers.


Indeed, thanks for pointing that out; I had forgotten which GPU he's using. BUT I might point out that an i7 3820 made a huge difference on my 7970 CF system coming from a Phenom 965: on average about 40-50% faster for gaming at 1080p.

HyperThreading is good for me, as I need it for video encoding.

But I must disagree: if most games were coded as primitively as 15 years ago, then none of them would make use of multi-core processors and they would perform terribly. HT is a whole other thing that most games indeed don't make use of.

I really feel you should read this:

http://www.gotw.ca/publications/concurrency-ddj.htm
November 4, 2012 4:56:02 AM

SharperZeroCool said:
Indeed, thanks for pointing that out; I had forgotten which GPU he's using. BUT I might point out that an i7 3820 made a huge difference on my 7970 CF system coming from a Phenom 965: on average about 40-50% faster for gaming at 1080p.

HyperThreading is good for me, as I need it for video encoding.

Well yeah, lol, I expect it would. 7970 CF is pretty high-end stuff; if you're spending $700-900 on dual graphics cards of that caliber, you absolutely should be looking at high-end CPUs. That's some serious hardware.

Quote:

But I must disagree: if most games were coded as primitively as 15 years ago, then none of them would make use of multi-core processors and they would perform terribly. HT is a whole other thing that most games indeed don't make use of.


I will read that article; I bookmarked it, but my brain is too exhausted to devote proper attention to it at the moment, so forgive me if I'm skipping over things. It certainly is possible for games to make use of multi-core processors. Most games use one core heavily and a second core as needed: the work is manually distributed among the other cores by the developers, unlike what you see with more modern programming languages and runtimes. And I'm somewhat of a noob at programming (it's too much blah blah blah even for my nerdy techie brain to follow), so forgive me if my terminology is off.
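To picture the "one busy main thread plus a helper used as needed" pattern being described, here's a toy Python sketch; it is not taken from any real engine, and every name in it is made up:

```python
# Toy sketch (not from any real game engine) of one heavily used main thread
# that only occasionally hands work to a second thread via a queue.
import queue
import threading
import time

task_queue = queue.Queue()

def worker():
    # Second thread: handles occasional background jobs from the main loop.
    while True:
        job = task_queue.get()
        if job is None:              # sentinel value: time to shut down
            break
        time.sleep(0.01)             # stand-in for real work (audio, AI, loading...)
        print("worker finished:", job)

helper = threading.Thread(target=worker)
helper.start()

for frame in range(3):               # stand-in for the main game loop
    time.sleep(0.016)                # main thread burns most of a ~60 fps frame budget
    if frame % 2 == 0:               # only sometimes is there extra work to hand off
        task_queue.put(f"background job for frame {frame}")

task_queue.put(None)                 # tell the helper to stop
helper.join()
```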
November 4, 2012 4:58:38 AM

nekulturny said:
Well yeah, lol, I expect it would. 7970 CF is pretty high-end stuff; if you're spending $700-900 on dual graphics cards of that caliber, you absolutely should be looking at high-end CPUs. That's some serious hardware.

Then why did you say this?
"There's no need to buy an i7 even for a high-end gaming system with CrossFire 7970s or SLI GTX 680s. Stronger CPUs cannot make weaker video cards perform better in gaming."
Doesn't matter; that would be deviating from the topic at hand.
November 4, 2012 5:09:17 AM

SharperZeroCool said:
Then why did you say this?
"There's no need to buy an i7 even for a high-end gaming system with CrossFire 7970s or SLI GTX 680s. Stronger CPUs cannot make weaker video cards perform better in gaming."
Doesn't matter; that would be deviating from the topic at hand.

Because an i5 will still do it, for gaming. Honestly, a double shot of 7970s is overkill if the highest resolution you're shooting for is 1920x1080, unless you like big FPS numbers; although unless you have a 120 Hz monitor (and I hope you do), anything above a stable 60 FPS is useless. I'm not sure what you mean by a 40% improvement, to be honest; my reply was more a way of being polite rather than saying something that might pick a fight. But if by 40 percent you mean you went from 60 FPS in a game to 100 FPS with a 60 Hz monitor, your performance bump is merely benchmarkable, not really related to actual game performance.
November 4, 2012 5:13:00 AM

nekulturny said:
Because an i5 will still do it, for gaming. Honestly, a double shot of 7970s is overkill if the highest resolution you're shooting for is 1920x1080, unless you like big FPS numbers; although unless you have a 120 Hz monitor (and I hope you do), anything above a stable 60 FPS is useless. I'm not sure what you mean by a 40% improvement, to be honest; my reply was more a way of being polite rather than saying something that might pick a fight. But if by 40 percent you mean you went from 60 FPS in a game to 100 FPS with a 60 Hz monitor, your performance bump is merely benchmarkable, not really related to actual game performance.


Yes, I do have a 120 Hz monitor, and it's part of my problem: something about running a 120 Hz monitor and a 60 Hz monitor together with this CF setup is causing games to freeze with v-sync on. Pain in the ass; I think it's a firmware glitch.

The 40% improvement was in benchmark programs such as 3DMark. There's probably some increase in games as well, but I don't really pay attention to that as long as they run above 60 FPS; anything below that and it gets hard for me to play.

I don't know much about the Intel i5 processors, or about any Intel processors really, performance-wise; I just recently started using one since the performance of AMD's Bulldozer kind of sucked from my point of view. But you might be right: an i5, on average, might be just as good for gaming as an i7 is.
November 4, 2012 5:23:10 AM

nekulturny said:
Well yeah, lol, I expect it would. 7970 CF is pretty high-end stuff; if you're spending $700-900 on dual graphics cards of that caliber, you absolutely should be looking at high-end CPUs. That's some serious hardware.

Quote:

But I must disagree: if most games were coded as primitively as 15 years ago, then none of them would make use of multi-core processors and they would perform terribly. HT is a whole other thing that most games indeed don't make use of.


I will read that article; I bookmarked it, but my brain is too exhausted to devote proper attention to it at the moment, so forgive me if I'm skipping over things. It certainly is possible for games to make use of multi-core processors. Most games use one core heavily and a second core as needed: the work is manually distributed among the other cores by the developers, unlike what you see with more modern programming languages and runtimes. And I'm somewhat of a noob at programming (it's too much blah blah blah even for my nerdy techie brain to follow), so forgive me if my terminology is off.


Yes, games nowadays can make use of multiple cores; games from 2003-ish and before can't, because they weren't coded to use the extra cores. That's the point of that article: the free ride that most programs and games got from ever-increasing single-core speed was over, and from that point on, if a program wanted better performance, it had to be coded to use multiple cores.
November 4, 2012 5:26:17 AM

Bulldozer kind of sucked from everyone's point of view :lol: PileDriver is what Bulldozer should have been; yes, it still suffers in individual core performance, but that's really becoming less of an issue going forward.

Yeah, I'd imagine you would get a 40 percent improvement in a synthetic bench. The i5s really aren't any different from i7s; it's just the HyperThreading. HT "pretends" to be extra cores, mainly to aid in removing bottlenecks.

It does have a small impact on gaming performance, but not directly. Because it does what it's meant to do, it indirectly impacts gaming performance (although by a very slight 1-2% on average), since it handles some of the workload from the background programs running on your machine while you're gaming.

V-sync is not something I'm too familiar with either. If I understand it correctly, it's meant to force the graphics card to match the refresh rate of the monitor. While it may help in some cases, other times it may actually make things worse if the video card wants to put out fewer frames per second than the monitor is capable of displaying. Or maybe I have it backwards? Did I mention it's late? lol
November 4, 2012 5:31:06 AM

SharperZeroCool said:
Yes, games nowadays can make use of multiple cores; games from 2003-ish and before can't, because they weren't coded to use the extra cores. That's the point of that article: the free ride that most programs and games got from ever-increasing single-core speed was over, and from that point on, if a program wanted better performance, it had to be coded to use multiple cores.

Well, like I said, I will read that article; I keep my promises, lol. But I think the problem with games is that there's a lot game developers could do for PC gaming that they won't do because it's not profitable. If game makers used the more modern programming approaches that let all cores be fully utilized, you'd run into situations that only the most powerful (and expensive) video cards could really keep up with. PC gamers already enjoy much better resolution and detail than console gamers can dream of, and there's still much more untapped potential even today. Battlefield 3, I think, was pretty bold in being a game well known to tax even high-end PCs; luckily, it's also scalable, so lower-end machines can still enjoy playable performance.

The problem therein is that consoles are far cheaper than high-end PCs like yours (yes, I looked at your member configuration page). Not many people are going to invest that kind of money in a computer system when they can go out and drop $300 on an Xbox 360. Blame the consoles for game programmers being so archaic; it's the consoles that dictate the market trends.
November 4, 2012 5:31:23 AM

nekulturny said:
Bulldozer kind of sucked from everyone's point of view :lol: PileDriver is what Bulldozer should have been; yes, it still suffers in individual core performance, but that's really becoming less of an issue going forward.

Yeah, I'd imagine you would get a 40 percent improvement in a synthetic bench. The i5s really aren't any different from i7s; it's just the HyperThreading. HT "pretends" to be extra cores, mainly to aid in removing bottlenecks.

It does have a small impact on gaming performance, but not directly. Because it does what it's meant to do, it indirectly impacts gaming performance (although by a very slight 1-2% on average), since it handles some of the workload from the background programs running on your machine while you're gaming.

V-sync is not something I'm too familiar with either. If I understand it correctly, it's meant to force the graphics card to match the refresh rate of the monitor. While it may help in some cases, other times it may actually make things worse if the video card wants to put out fewer frames per second than the monitor is capable of displaying. Or maybe I have it backwards? Did I mention it's late? lol


Yes, you're absolutely right for the most part. It's not that the video card wants to put out fewer frames per second than the monitor; it's that in some cases the video card can't keep up, i.e. it runs at less than the monitor's refresh rate.
Yes, it's late, 5:30 AM here.
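A simplified model of that last point, assuming plain double-buffered v-sync (no triple buffering or adaptive v-sync): each finished frame waits for the next refresh tick, so the displayed frame time effectively rounds up to a multiple of the refresh interval. The numbers below are illustrative only:

```python
# Simplified double-buffered v-sync model: a frame is shown on the first
# refresh tick after it finishes rendering, so a GPU that is just slightly
# slower than the monitor drops to half the refresh rate.
import math

def effective_fps(gpu_fps: float, refresh_hz: float) -> float:
    """FPS actually displayed with v-sync on, under this simple model."""
    refresh_interval = 1.0 / refresh_hz
    render_time = 1.0 / gpu_fps
    displayed_interval = math.ceil(render_time / refresh_interval) * refresh_interval
    return 1.0 / displayed_interval

for gpu_fps in (120, 70, 59, 40):
    print(f"GPU {gpu_fps:>3} fps on a 60 Hz panel -> "
          f"{effective_fps(gpu_fps, 60):.0f} fps displayed")
```

Under this model a card rendering at 59 fps on a 60 Hz monitor ends up displaying only 30 fps, which is why v-sync can make things worse when the GPU can't keep up.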
November 4, 2012 5:38:47 AM

nekulturny said:
Well, like I said, I will read that article; I keep my promises, lol. But I think the problem with games is that there's a lot game developers could do for PC gaming that they won't do because it's not profitable. If game makers used the more modern programming approaches that let all cores be fully utilized, you'd run into situations that only the most powerful (and expensive) video cards could really keep up with. PC gamers already enjoy much better resolution and detail than console gamers can dream of, and there's still much more untapped potential even today. Battlefield 3, I think, was pretty bold in being a game well known to tax even high-end PCs; luckily, it's also scalable, so lower-end machines can still enjoy playable performance.

The problem therein is that consoles are far cheaper than high-end PCs like yours (yes, I looked at your member configuration page). Not many people are going to invest that kind of money in a computer system when they can go out and drop $300 on an Xbox 360. Blame the consoles for game programmers being so archaic; it's the consoles that dictate the market trends.


Yes I know, damn consoles :p
But also blame the CPU manufacturers for not spending more time trying to break the 4-5 GHz limit, and instead spending so much time adding features for overclocking, when if we broke that limit we wouldn't need to overclock. Those huge air coolers like my old Thermalright Ultra-120, and even water coolers, just show how ridiculous all of this is and that we are going in the wrong direction.
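For what it's worth, the usual back-of-the-envelope reason clocks stalled is dynamic power, roughly P ≈ C·V²·f, and higher clocks normally need higher voltage too. A rough Python illustration with entirely made-up capacitance and voltage figures:

```python
# Rough illustration of why "just raise the clock" stopped working:
# dynamic (switching) power scales roughly as P = C * V^2 * f, and higher
# clocks usually require higher voltage. All numbers are made up for
# illustration; they are not measurements of any real chip.

def dynamic_power(cap_farads: float, volts: float, freq_hz: float) -> float:
    """Approximate dynamic power of a CPU in watts."""
    return cap_farads * volts ** 2 * freq_hz

base = dynamic_power(2e-8, 1.25, 3.4e9)   # hypothetical 3.4 GHz chip at 1.25 V
oc   = dynamic_power(2e-8, 1.45, 4.1e9)   # same chip pushed to 4.1 GHz at 1.45 V
print(f"baseline:  {base:.0f} W of dynamic power")
print(f"overclock: {oc:.0f} W ({oc / base:.2f}x the heat to dissipate)")
```

A roughly 20% clock bump with the extra voltage it needs ends up costing around 60% more heat in this toy example, which is why the big coolers show up.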
November 4, 2012 2:18:42 PM

hafijur said:
2 cores with the same architecture as the 1 core give double the performance if used 100%.

These days top-end CPUs are equivalent to a P4 running at 50 GHz, even if they're only OC'd to 4.5 GHz.

Also, these days we have 4.5-5 GHz after OC on i7 quads. It's actually amazing the speed difference from a 3.8 GHz P4 to a top-end CPU today, and they both take the same wattage.


Yes, because it's not only cores and clock that affect a CPU's performance; the architecture is much better now. But believe me, you don't get anywhere near double the improvement when a processor has double the cores, whether 2 to 4 or 4 to 8. You get some, but it really depends on how well the program is coded to use the extra cores; sometimes that's the biggest slowdown. It also depends on the logic that ties all the cores together. So no, you won't see double the performance on the same chip with double the cores, as you said. Most of the time it's the coders' fault, but that's how it is.
November 4, 2012 2:20:49 PM

Clock speeds aren't everything. Intel tried to push their clocks up high with the NetBurst architecture, and AMD embarrassed them badly at the time with superior-performing, lower-clocked CPUs in the form of the original Athlon. You think Bulldozer is an embarrassment for AMD? It's got nothing on NetBurst. AMD has always been the underdog in the two-dog race; they're expected to fumble more often.
November 4, 2012 2:28:25 PM

nekulturny said:
Clock speeds aren't everything. Intel tried to push their clocks up high with the NetBurst architecture, and AMD embarrassed them badly at the time with superior-performing, lower-clocked CPUs in the form of the original Athlon. You think Bulldozer is an embarrassment for AMD? It's got nothing on NetBurst. AMD has always been the underdog in the two-dog race; they're expected to fumble more often.

Yes, clock speeds aren't everything, but they are the big dog: the clock determines how many operations a processor can perform per second, and the boost a higher clock gives is huge.
November 4, 2012 3:18:38 PM

hafijur said:
Look at the benchmarks: generally the score doubles if 2 cores are added at the same clock speed.

Also, on 100% CPU video encoding you should get 2x the performance going from, say, a 2.4 GHz Core 2 Duo E6600 to a Q6600 at 2.4 GHz.

My whole point is that the more cores you add, the less extra performance each one gives. With 2 you get a great boost, but again not usually double; it all depends on how well the program was coded, plus a few other things. After that, a third or fourth core won't keep giving the huge improvement the second one did. Again, the best performance boost comes from clock speed, something that at this point can't pass the 4 GHz mark without ridiculous cooling and just pumping extra power into it. Remember that Intel expected a 4 GHz processor by 2005, and only now are we hitting that mark; if we manage to break this wall we will see a huge boost in performance.
Intel expected to make 10 GHz processors by 2010. Imagine that in a 2-4 core processor!
November 5, 2012 5:41:13 AM

hafijur said:
They realised they couldn't, and adding more cores was the way to go. In essence a 4 GHz i7 is similar in performance to 16 GHz total, and 4x that again over Pentium 4-architecture GHz.

Either way, I don't think you understand the difference between CPUs now and the P4 era. Pentium M CPUs were embarrassing the P4, and 65 nm Core 2 Duo vs 65 nm P4 showed how much architecture can do. Now we are at Ivy Bridge with improved functions and architecture, and 22 nm allows for more clock speed, more OC headroom and more cores. Before, it was not possible to do dual core because it would take too much power unless really low clocked; then Intel and AMD worked it out and tons of dual-core systems came out in 2005/2006. Either way, a 3.8 GHz P4 HT taking 24 hours to do a task vs an i7 at 3.8 GHz, 4C/8T, taking around 1 hour: in effect it's like 80 GHz+ of P4 performance.

Anyway, performance is the main thing, and performance per watt.

Pentium M and Athlon 64 showed that back in 2004; a P4 looked like an ancient CPU even at 3.8 GHz, performance-per-watt-wise.

At this point I'm tired of this argument.

All I can do is recommend that you read this: http://www.gotw.ca/publications/concurrency-ddj.htm

or at least this next part, which I will paste here:

Myths and Realities: 2 x 3GHz < 6 GHz
So a dual-core CPU that combines two 3GHz cores practically offers 6GHz of processing power. Right?
Wrong. Even having two threads running on two physical processors doesn’t mean getting two times the performance. Similarly, most multi-threaded applications won’t run twice as fast on a dual-core box. They should run faster than on a single-core CPU; the performance gain just isn’t linear, that’s all.
Why not? First, there is coordination overhead between the cores to ensure cache coherency (a consistent view of cache, and of main memory) and to perform other handshaking. Today, a two- or four-processor machine isn’t really two or four times as fast as a single CPU even for multi-threaded applications. The problem remains essentially the same even when the CPUs in question sit on the same die.
Second, unless the two cores are running different processes, or different threads of a single process that are well-written to run independently and almost never wait for each other, they won’t be well utilized. (Despite this, I will speculate that today’s single-threaded applications as actually used in the field could actually see a performance boost for most users by going to a dual-core chip, not because the extra core is actually doing anything useful, but because it is running the adware and spyware that infest many users’ systems and are otherwise slowing down the single CPU that user has today. I leave it up to you to decide whether adding a CPU to run your spyware is the best solution to that problem.)
If you’re running a single-threaded application, then the application can only make use of one core. There should be some speedup as the operating system and the application can run on separate cores, but typically the OS isn’t going to be maxing out the CPU anyway so one of the cores will be mostly idle. (Again, the spyware can share the OS’s core most of the time.)
November 5, 2012 6:37:57 AM

Performance per watt is great for laptops, but if AMD created a processor that needed a minimum 1000 W PSU yet outperformed the i7, we would all have 1000 W PSUs.
November 5, 2012 6:48:22 AM

abbadon_34 said:
Performance per watt is great for laptops, but if AMD created a processor that needed a minimum 1000 W PSU yet outperformed the i7, we would all have 1000 W PSUs.

Did you know that processors could be made of diamond instead of silicon? It would help them withstand higher temperatures, up to a couple of hundred degrees Celsius instead of the usual 60-90 °C limit, but we just don't make them because it's not good business, just like endlessly adding more power. Also, the more power you add, the more heat you have to deal with; we are already at the point where a massive air cooler or water cooler is required for proper heat dissipation, and a processor using 1000 W of power would require a liquid-nitrogen cooling system to work properly. It's not by adding more power that we will solve the problem with today's processors, nor by adding a ridiculous number of extra cores; we need to find the right material or process to build a processor with a higher clock and break the 4 GHz wall we are at.
November 5, 2012 7:08:36 AM

You are getting too specific. What I'm saying is, ALL OTHER THINGS BEING EQUAL, wattage is a "red herring". If you could flip a switch and suddenly AMD left Intel in the dust, without changing anything except the power quadrupling, wouldn't you? Is anyone thinking of wattage when overclocking? If OC'ing an AMD to 10 GHz were as simple as buying a new power supply, don't you think that would change the game? I know none of this is true; I'm just trying to make a point about enthusiasts, which most people here are at heart.
November 5, 2012 7:59:19 AM

abbadon_34 said:
You are getting too specific. What I'm saying is, ALL OTHER THINGS BEING EQUAL, wattage is a "red herring". If you could flip a switch and suddenly AMD left Intel in the dust, without changing anything except the power quadrupling, wouldn't you? Is anyone thinking of wattage when overclocking? If OC'ing an AMD to 10 GHz were as simple as buying a new power supply, don't you think that would change the game? I know none of this is true; I'm just trying to make a point about enthusiasts, which most people here are at heart.

No, I wouldn't, because that would also burn the CPU; it wouldn't be able to handle all that heat. That's not how this works. It's not by adding more power and heat that we will hit higher clocks; it's by the manufacturers finding a way to get there with low power usage and less heat. As it stands right now, the laws of physics tell us that every power source dissipates heat, and unfortunately we can't have the perfect CPU where 100% of the power put into it goes to useful work and none is lost as heat. That's what all that heat is, power not well used, but a perfect CPU is impossible :(

I get where you're coming from, but I also don't see why you think AMD would benefit from this extra power and Intel wouldn't. Of course an AMD overclocked to 10 GHz would leave a stock Intel at 3-4 GHz in the dust, but if you overclocked them both to 10 GHz I honestly am not sure what would happen; it depends on the processors being compared, I guess. From the last time I checked AMD and Intel processors, Intel's top 3960X would wreck AMD's top Bulldozer, but it seems AMD is now out with a new line of processors that are not as terrible as Bulldozer was.
Also, don't get me wrong, I'm not an Intel fanboy; I'm no fanboy at all. I just go for the best I can get at a given moment. In the past I used both AMD and Nvidia graphics cards as well as Intel and AMD processors, but right now I'm with an Intel CPU and an AMD video card.
November 5, 2012 9:43:47 AM

hafijur said:
sharpzerocool: You only have to look at the Pentium D vs the Pentium 4, or Core 2 Duo vs Core 2 Quad at the same clock, to see that the benchmark scores virtually double when pushed to the extreme. It has to be the same architecture, so a 65 nm Core 2 Duo to a 65 nm Core 2 Quad, etc. Anyway, the Fritz chess benchmark shows on some scale how much faster today's CPUs are than a 1 GHz Pentium 3. Some are 60x faster; they are not 60 GHz, but in effect they perform like a 60 GHz P3 would if it existed.

Yes, because other things have changed in today's CPUs that allow them to be faster; it's not just multiple cores and clock, you know!
Anyway, I've said enough in this thread and anything I add now would just be redundant. If you read my posts and still think that, then believe whatever you want to believe.

For the last time I will just leave this here, as I feel it can enlighten you: http://www.gotw.ca/publications/concurrency-ddj.htm; it was written in 2005 and has held up ever since.