AMD Phenom With B3 Stepping: First Look

March 18, 2008 10:23:28 AM

According to our tests, this CPU is based on the highly anticipated B3 stepping. It's meant to be error-free, allow higher clock speeds, and still consume less power than AMD's Phenom 9300 and 9400 models.

http://www.tomshardware.com/2008/03/18/amd_phenom/index.html


March 18, 2008 11:08:38 AM

Hold your horses. As the article mentions, this is a "First Look."
Wait a couple of weeks for a full review and then we'll see.
March 18, 2008 11:11:44 AM

Can't wait for B3!
If there is a 9950, count me in! :D 
March 18, 2008 12:07:23 PM

muk said:
According to our tests, this CPU is based on the highly anticipated B3 stepping. It's meant to be error-free, allow higher clock speeds, and still consume less power than AMD's Phenom 9300 and 9400 models.

http://www.tomshardware.com/2008/03/18/amd_phenom/index.html


#1) It fixes the errata.
#2) It does not OC any higher.
#3) It does not perform any faster clock for clock.
#4) It does not use any less power.

Gotta wait until the fall for the 45nm versions for those hopes.
March 18, 2008 12:09:52 PM

So from the article I am guessing it's just a Phenom 9700, only TLB-errata free?
March 18, 2008 12:16:25 PM

I just can't remember: did AMD claim it would give added performance, or did people on the forums just wish for it?
March 18, 2008 12:18:51 PM

How do we know it doesn't OC any higher or perform faster?
Certainly not from this article, which basically says "You know AMD had errata; they're gone in this new revision, and it fits into an MSI K9A2 Platinum without a BIOS update."

We still don't know if AMD has done anything else to the Phenom in revision 3 besides getting rid of the errata. They might have tweaked it so it performs better than the second revision in one or more ways; hell, it might even be a decent overclocker. We just can't tell from the article.

However, it's Easter, so I forgive Tom's Hardware for this severely lacking article that's hardly interesting to anyone but owners of an MSI K9A2 Platinum motherboard.
March 18, 2008 12:21:45 PM

justjc said:
How do we know it doesn't OC any higher or perform faster?
Certainly not from this article, which basically says "You know AMD had errata; they're gone in this new revision, and it fits into an MSI K9A2 Platinum without a BIOS update."

We still don't know if AMD has done anything else to the Phenom in revision 3 besides getting rid of the errata. They might have tweaked it so it performs better than the second revision in one or more ways; hell, it might even be a decent overclocker. We just can't tell from the article.

However, it's Easter, so I forgive Tom's Hardware for this severely lacking article that's hardly interesting to anyone but owners of an MSI K9A2 Platinum motherboard.


Did you read the whole thing? They state they will have a full batch of tests in a review soon. Anand had the same thing yesterday, so it's not just THG. The next few days will be interesting.
March 18, 2008 12:30:48 PM

Finally ...

March 18, 2008 1:22:45 PM

I wonder, if we all think together that the B3 Phenom will perform and OC much better, whether it will really happen. Put those positive brain waves to good use...
March 18, 2008 1:23:37 PM

zenmaster said:
#1) It fixes the errata.
#2) It does not OC any higher.
#3) It does not perform any faster clock for clock.
#4) It does not use any less power.

Gotta wait until the fall for the 45nm versions for those hopes.


Well, we don't know if it OCs further, and we don't know for sure if it uses the same/more/less power...but we should expect the same.

What we do know is that the TLB "bug," which has only shown up in a -lab- so far, is fixed.
To be honest this "bug" has never been seen outside of a -lab-...so good call for AMD fixing the -bug- anyhow, after first recalling past shipments.

Perhaps near the end of the year they will ship a 3GHz product.
No matter the maker, in real life after you hit 3GHz only a benchmark can tell if a game runs any better...and almost all software is still rated for P4-class hardware.
March 18, 2008 1:27:32 PM

I just want all the damn prices to go down on all the chips, period. Intel needs some competition; hopefully this brings down chip costs all around.
March 18, 2008 1:39:18 PM

mrgoodbar said:
I just want all the damn prices to go down on all the chips, period. Intel needs some competition; hopefully this brings down chip costs all around.

I am sure it will.
Right now a game running on an AMD 6000+ @ $111 USD runs just as well, as far as the player can tell, versus a top-of-the-line Intel quad overclocked @ $1,100 USD.

Benchmarks can see a gain...but the game player's eye can't.
Game and office software is still written to P4-era ratings.
March 18, 2008 1:40:48 PM

ZOldDude said:
Well, we don't know if it OCs further, and we don't know for sure if it uses the same/more/less power...but we should expect the same.

What we do know is that the TLB "bug," which has only shown up in a -lab- so far, is fixed.
To be honest this "bug" has never been seen outside of a -lab-...so good call for AMD fixing the -bug- anyhow, after first recalling past shipments.

Perhaps near the end of the year they will ship a 3GHz product.
No matter the maker, in real life after you hit 3GHz only a benchmark can tell if a game runs any better...and almost all software is still rated for P4-class hardware.


Thing about the OCing is we don't know that the TLB bug was causing that. It might just be the architecture or the process (65nm SOI).

And for some games, such as Crysis, even past 3GHz it helps. And a lot of games after that will probably be the same. But of course that's just one game, and you are right that past 3GHz, on most games, only a benchmark will tell. I ran the CS:Source stress test twice and got 140-160 FPS average, but to me it just looked normal until I saw the results.
March 18, 2008 1:44:46 PM

ZOldDude said:
I am sure it will.
Right now a game running on an AMD 6000+ @ $111 USD runs just as well, as far as the player can tell, versus a top-of-the-line Intel quad overclocked @ $1,100 USD.

Benchmarks can see a gain...but the game player's eye can't.
Game and office software is still written to P4-era ratings.



^^ Did you hear that the new 5600+ will be a Black Edition with an unlocked multiplier at 65W instead of 89W? Might be nice; they are releasing this, the B3 Phenoms, and the tri-cores soon. Currently I'm on an old P4; I sold my 6400+ waiting on these Phenom chips. I have waited so long, so dang long.
March 18, 2008 1:45:25 PM

Nothing is ever going to make Crysis run better/faster...it was written for top-end hardware of its time.

Don't spend a dime thinking it will run better if you already have any brand of CPU @ 3GHz (1-500 cores) or an 8800GTX.

Most people who just play games or run home GFX/office software would have to be "less than smart" to pay -ten times- the price for a CPU if they could never "see" a gain.
March 18, 2008 1:49:12 PM

ZOldDude said:
Nothing is ever going to make Crysis run better/faster...it was written for top-end hardware of its time.

Don't spend a dime thinking it will run better if you already have any brand of CPU @ 3GHz (1-500 cores) or an 8800GTX.


No, it's more like Doom 3/HL2. They were both released and could run on then-current hardware, but performed best on hardware yet to be released. I am sure that an R770 or G100 will be able to tame Crysis. But a CPU with more IPC and at least 3GHz will help for now.
March 18, 2008 1:58:43 PM

jimmysmitty said:
No, it's more like Doom 3/HL2. They were both released and could run on then-current hardware, but performed best on hardware yet to be released. I am sure that an R770 or G100 will be able to tame Crysis. But a CPU with more IPC and at least 3GHz will help for now.

Jimbo...I edited my post above yours while you submitted yours.
By the way...Crysis is not a "problem" for almost anyone to run/play.

The idea that Crysis runs like a P.O.S. until you have the latest hardware, which did not exist when it was written, is the BS of hardware sellers and nothing more...nor will it ever be.

March 18, 2008 2:10:44 PM

ZOldDude said:
Jimbo...I edited my post above yours while you submitted yours.
By the way...Crysis is not a "problem" for almost anyone to run/play.


What I mean by "tame" is not so much it being a problem, but being able to run it at a decent res (such as 1680x1050) with everything set to very high plus AA/AF. Right now I have yet to see a system that can play at those settings while getting 30+ FPS average.
March 18, 2008 2:12:27 PM

jimmysmitty said:
What I mean by "tame" is not so much it being a problem, but being able to run it at a decent res (such as 1680x1050) with everything set to very high plus AA/AF. Right now I have yet to see a system that can play at those settings while getting 30+ FPS average.

Sorry...I edited it again while you were posting.

Believe it or not, -most- people do not game @ 1680x1050...in fact almost nobody does.
If you think that statement is not correct, poll 11-15 million gamers and tell me what they say.

Game software companies already know the answer and write their software to fit the customer base.

EDIT: You can spend a BILLION dollars on a 30GHz 500-core CPU and a GFX card that is 10K times faster than an 8800GTX...and the game will still run the same.
March 18, 2008 2:44:21 PM

ZOldDude said:
Believe it or not, -most- people do not game @ 1680x1050...in fact almost nobody does.
If you think that statement is not correct, poll 11-15 million gamers and tell me what they say.


Will 1.5 million do?

1680x1050 is the third most used screen resolution on Steam...


http://www.steampowered.com/status/survey.html

Primary Display Resolution (1,439,533 users)
800 x 600 -- 24,323 (1.69%)
1024 x 768 -- 458,882 (31.88%)
1152 x 864 -- 74,797 (5.20%)
1280 x 960 -- 569,003 (39.53%)
1440 x 900 -- 105,925 (7.36%)
1600 x 1200 -- 24,432 (1.70%)
1680 x 1050 -- 129,643 (9.01%)
1920 x 1200 -- 32,576 (2.26%)
Other -- 19,952 (1.39%)
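
For the curious, a quick tally of those numbers (a rough Python sketch using only the figures listed above):

shares = {  # percent of users, copied from the survey list above
    "800x600": 1.69, "1024x768": 31.88, "1152x864": 5.20,
    "1280x960": 39.53, "1440x900": 7.36, "1600x1200": 1.70,
    "1680x1050": 9.01, "1920x1200": 2.26, "Other": 1.39,
}
# the 16:10 panels at 1680x1050 and up vs. every other named resolution
high = shares["1680x1050"] + shares["1920x1200"]
rest = sum(v for k, v in shares.items() if k not in ("1680x1050", "1920x1200", "Other"))
print(f"1680x1050 and up: {high:.2f}%")  # ~11.27%
print(f"everything else:  {rest:.2f}%")  # ~87.36% ("Other" excluded)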
March 18, 2008 2:57:09 PM

Nope...won't do for the topic -or- for game software writers.

Your Steam link shows only 9% of the people that play crap games on Steam use 1680x1050.

I bet that makes CSS way better and leet...not.

EDIT: The link shows most people play Steam games with ONE CPU core and ONE GB or LESS of RAM.
That is still what software companies write games to run on...and if you have more, better for you.

However, having 100 times the CPU and/or common graphics card speed will never make a game run beyond what it was written to do.
March 18, 2008 2:58:50 PM

I never claimed that survey was anything other than people on Steam...
March 18, 2008 3:04:07 PM

ZOldDude said:
Nope...won't do for the topic -or- for game software writers.

Your Steam link shows only 9% of the people that play crap games on Steam use 1680x1050.

I bet that makes CSS way better and leet...not.


Um, considering that Steam has had 15 million registered users, I highly doubt it is crap. And out of the 1.5 million who gave that info, 9% is a lot. Also, not every game on Steam is crap: you've got BioShock, and now the entire Unreal/UT series is available through Steam.

And CS:S is not a bad game. Yes, it is filled with a bunch of noobs, but a high 16x10 resolution probably makes the game look very nice.

Either way, this proves that more and more people are moving to higher resolutions, 16x10 being the main one.
March 18, 2008 3:11:41 PM

ZOldDude said:
Well, we don't know if it OCs further, and we don't know for sure if it uses the same/more/less power...but we should expect the same.

What we do know is that the TLB "bug," which has only shown up in a -lab- so far, is fixed.
To be honest this "bug" has never been seen outside of a -lab-...so good call for AMD fixing the -bug- anyhow, after first recalling past shipments.

Perhaps near the end of the year they will ship a 3GHz product.
No matter the maker, in real life after you hit 3GHz only a benchmark can tell if a game runs any better...and almost all software is still rated for P4-class hardware.


Well, we know a little from this article.
We also know a little from AnandTech's article.

If you read the article, Tom's did do some testing.
They found that the 2.4 used a little more power than the 2.3 version.

Clearly, with such a small bump in speed, if there were any power savings to be found the results should have shown the same or less power used.

Tom's also stated that the 2.4 showed the increase in performance you would expect from a bump over 2.3GHz.
AnandTech's article, which was more in-depth, revealed the same things.

The only performance gain you can claim is that you don't need to have the TLB erratum fix enabled in the BIOS.
Note: Most home users did not enable this since it mostly affected virtualization workloads. So if you want to claim B3 is faster since that BIOS patch is not required, I will agree, but most sites reviewed the Phenom without it enabled for most tests and provided separate results for when the patch was in place.

The exciting news for AMD with the B3 is that they will be able to start shipping server CPUs now.
The B3 is of little importance to the desktop market, since at most it will usher in a new 2.4GHz vs 2.3GHz Phenom.

At some point down the road a 2.5GHz version may also come out on the B3 stepping.

However, AMD has stated that faster Phenoms will not ship until 45nm comes out.
I wish the B3 had done more, as it would help curb Intel prices.
March 18, 2008 3:15:54 PM

Well, CSS also looks good @ 800x600...but that is not the point, is it?

There is a point where CPU GHz and GFX cards move -beyond- what the software was written for...and making the hardware 10K times faster will not make the game look any better or run any better.

Spending 10 times the price for a CPU over 3GHz from one brand over another is never going to make any game already made run any better...and a GFX card 100K times faster than the best available when the game was written can never make it look better than it was written to look.

EDIT: 9% is far from "most," like I said anyhow.
March 18, 2008 3:28:58 PM

ZOldDude said:
Well, CSS also looks good @ 800x600...but that is not the point, is it?

There is a point where CPU GHz and GFX cards move -beyond- what the software was written for...and making the hardware 10K times faster will not make the game look any better or run any better.

Spending 10 times the price for a CPU over 3GHz from one brand over another is never going to make any game already made run any better...and a GFX card 100K times faster than the best available when the game was written can never make it look better than it was written to look.

EDIT: 9% is far from "most," like I said anyhow.


No, you said almost no one. That's just a small sampling of people.

Either way, if a game cannot be run on the highest settings when it's released, then hell yeah it will look better on faster, newer hardware. That happened with Doom 3 and HL2: Valve released HDR, which added new realism to the game.

A higher resolution will make the game look smoother and better too. It will also run faster. Sure, spending $1K on a CPU is a waste, but that CPU will do almost anything and whomp pretty much everything else out there in gaming, rendering, and audio.
March 18, 2008 4:00:53 PM

I game at 1680x1050 all the time. It's what my monitor defaults to; I'm sure plenty of people own 22-inch widescreens.
March 18, 2008 4:02:02 PM

Also, did the article not say that at 2.4GHz it WAS drawing more power than the lower-speed models?
March 18, 2008 4:16:28 PM

zenmaster said:
Well, we know a little from this article.
We also know a little from AnandTech's article.

If you read the article, Tom's did do some testing.
They found that the 2.4 used a little more power than the 2.3 version.


That would be due to the fact that the voltage is set higher in that system than stock. My guess would be they didn't clear out any previous OC settings. But the point of the drop-in test would be to make sure the TLB fix wasn't going to affect the B3 as well; as Anand pointed out, the option completely disappears from the BIOS with a B3 Phenom.

zenmaster said:
Clearly, with such a small bump in speed, if there were any power savings to be found the results should have shown the same or less power used.


More power is being used due to the higher voltage.

zenmaster said:
Tom's also stated that the 2.4 showed the increase in performance you would expect from a bump over 2.3GHz.
AnandTech's article, which was more in-depth, revealed the same things.


Actually, the AnandTech review used the ES clocked at 2.3 to compare against the 9600BE clocked at 2.3, and showed the B3 performing slightly better on the test they used, likely due to TLB erratum 254 possibly being fixed in B3 as well.

zenmaster said:
The only performance gain you can claim is that you don't need to have the TLB erratum fix enabled in the BIOS.
Note: Most home users did not enable this since it mostly affected virtualization workloads. So if you want to claim B3 is faster since that BIOS patch is not required, I will agree, but most sites reviewed the Phenom without it enabled for most tests and provided separate results for when the patch was in place.


This is true, though granted a lot of the BIOSes that allow disabling the TLB fix don't implement it properly. A lot of the ones for the K9A2 Plat that let you disable it only disable it on the first core. That's great for single-threaded apps such as most games, but you can tell on anything multithreaded, like WinRAR, the few multithreaded games, and encoding. Performance beta P.0j on the K9A2 Plat is the first BIOS I've seen that implements the disable option properly.

A lot of the issues with the Phenom seem to be BIOS-maturity related when it comes to stability. I've noticed that as the BIOS on the K9A2 Plat matures, I've required less and less voltage to OC to the same point. For example, under BIOS 1.1b3 I was running 2.6/2.7 at 1.262 VID (1.248V actual) without C&Q enabled, for stability. Now with BIOS P0j I can run the same speeds at 1.250 VID (1.240V actual) with C&Q enabled and completely stable. Stock voltage for a 9600BE is 1.232v actual.
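
To put a rough number on that voltage drop (a back-of-the-envelope sketch; core dynamic power scales roughly with V^2 at a fixed clock):

# effect of going from 1.248V to 1.240V actual at the same 2.6/2.7 speeds
v_old, v_new = 1.248, 1.240
saving = 1 - (v_new / v_old) ** 2
print(f"~{saving * 100:.1f}% less core dynamic power")  # ~1.3%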

zenmaster said:
The exciting news for AMD with the B3 is that they will be able to start shipping server CPUs now. The B3 is of little importance to the desktop market, since at most it will usher in a new 2.4GHz vs 2.3GHz Phenom. At some point down the road a 2.5GHz version may also come out on the B3 stepping.


They're supposed to be releasing the 9150-9750 and the 9850BE within the next month or so. The 9850BE is a 2.5GHz B3 revision. No idea when the 9950 will be coming out; the Phenom FX-82 is supposed to come in at 2.6GHz, but no firm date on that either. Sorta makes me wish I'd waited for B3, but then again I haven't exactly had bad luck with my B2 9600BE.

zenmaster said:
However, AMD has stated that faster Phenoms will not ship until 45nm comes out.
I wish the B3 had done more, as it would help curb Intel prices.


I don't know; I think they'll hit 2.6 on 65nm, but I agree they probably won't ramp past that till 45nm as far as actual retail release speeds go. Though the BEs and FXs may be able to push past that with OCing. I can push beyond 2.6 right now, but mine's an exception and not the norm. Granted, that could be due to any number of things: cooling, airflow, or whatnot.
March 18, 2008 4:51:47 PM

Back to the monitor issue: from Steam's survey, 87% of the people had a monitor below 1680 x 1050 resolution. So by far the majority of gamers, not to mention the non-gaming community, will not benefit from better CPUs running at 3GHz or above.

As for the Phenom B3, the one tested fixed the TLB problem and ran at a faster stock speed, so I judge it a winner. It doesn't perform as well as Intel's better chips, but it does what it was meant to do. I agree with ZOldDude that for most people the fastest chips on the market are pretty much a waste, as is overclocking to very high speeds. It looks good for benches and bragging rights, but that's all. That said, some software, like FSX, seems to demand as much from a CPU as you can feed it, and future games will probably do the same. I think we're in a state of transition at the moment, and during the next year, maybe two, a whole lot of older computers and their hardware are going to be left behind, just as when the transition was made from 16-bit to 32-bit OSs, from Windows 3.1 to Windows 95/98.

The one thing I see that could throw a monkey wrench into all this is the tanking economy. If economic pressures get too great, new development from hardware and software companies will slow to a crawl.

One last note: when the new Intel chips come out in a few days, I plan to order one, along with an X48 mobo, for my gaming computer. But I also figure on getting a B3 Phenom for my business computer, as it's a cheap upgrade that should do the desired work.
March 18, 2008 4:58:19 PM

I know this is somewhat off topic (but so was the monitor discussion :D ), but does anyone know if the dual-core Phenoms will have a TDP of 45W? I want a 45W CPU for my HTPC, so if the Phenom dual-cores are, that's great. Otherwise I'll just go with a 4850e. Thanks.
March 18, 2008 5:11:44 PM

ZOldDude said:
Jimbo...I edited my post above yours while you submitted yours.
By the way...Crysis is not a "problem" for almost anyone to run/play.

The idea that Crysis runs like a P.O.S. until you have the latest hardware, which did not exist when it was written, is the BS of hardware sellers and nothing more...nor will it ever be.


Amen to that; there's no rule that says "it's either all on high or it's not worth playing." My stock 2900 Pro plays it fine, thank you very much.
March 18, 2008 5:12:19 PM

EXT64 said:
I know this is somewhat off topic (but so was the monitor discussion :D ), but does anyone know if the dual-core Phenoms will have a TDP of 45W? I want a 45W CPU for my HTPC, so if the Phenom dual-cores are, that's great. Otherwise I'll just go with a 4850e. Thanks.


From what I've read, the standard Phenoms run about 125W TDP, with a low-power (and lower-speed) group running at 95W TDP. So it's altogether possible that a dual-core Phenom will run in the 45W window. But that is speculation and nothing more.
March 18, 2008 5:20:26 PM

Yeah, the old roadmaps listed the duals at 45W, but we see how wrong those were (at least clock-speed-wise). I hope they make them 45W and priced in the $100-$120 range. I'd buy one then.
March 18, 2008 5:34:32 PM

EXT64 said:
I know this is somewhat off topic (but so was the monitor discussion :D ), but does anyone know if the dual-core Phenoms will have a TDP of 45W? I want a 45W CPU for my HTPC, so if the Phenom dual-cores are, that's great. Otherwise I'll just go with a 4850e. Thanks.


From what I've seen, DC Phenoms will have a 65W TDP. You can always undervolt it, though; I'm sure there's some leeway at stock speeds.
March 18, 2008 5:57:52 PM

Mathos said:

Actually, the AnandTech review used the ES clocked at 2.3 to compare against the 9600BE clocked at 2.3, and showed the B3 performing slightly better on the test they used, likely due to TLB erratum 254 possibly being fixed in B3 as well.


AMD Phenom 9600 (B2 stepping), TLB fix disabled: 1348 KB/s
AMD Phenom B3 @ 2.3GHz: 1357 KB/s

The difference in performance is about 0.7%, which is likely well within the margin of error for the test.
I certainly hope that the "performance increase" they are touting is not 0.7%.
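
A quick sanity check on that figure (a one-line sketch):

# relative gain of the B3 result over the B2 result quoted above
b2, b3 = 1348, 1357  # KB/s
print(f"{(b3 - b2) / b2 * 100:.2f}%")  # ~0.67% -- well within run-to-run noise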


March 18, 2008 6:01:58 PM

epsilon84 said:
From what I've seen, DC Phenoms will have a 65W TDP. You can always undervolt it, though; I'm sure there's some leeway at stock speeds.


*nod* There seems to be a bit of leeway, especially on the IMC side, which appears to be the biggest part of the TDP equation. Lowering the volts on the cores has a slight effect, but lowering the voltage on the IMC gives a lot more. And from what I've seen, people have been able to run their IMCs down around 1.1V or maybe a bit lower at the stock 1.8GHz or 2.0GHz speeds.
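
Rough numbers for the IMC side too, using the same V^2 back-of-the-envelope scaling. Note the 1.25V "stock" figure here is an assumption for illustration only, not something from this thread:

# assumed stock IMC voltage of 1.25V, undervolted to 1.10V at the same clock
v_stock, v_under = 1.25, 1.10  # 1.25V stock is an assumed figure
saving = 1 - (v_under / v_stock) ** 2
print(f"~{saving * 100:.0f}% less IMC dynamic power")  # ~23%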
March 18, 2008 6:52:48 PM

I'm pretty much waiting for the benchmarks. I've said it before: if it still cannot beat an X2 6400+ in gaming...it's still a piece of crap. Let's hope they can actually beat their own CPUs!
March 18, 2008 7:52:44 PM

I play Crysis @ 1680x1050 (no AA though), all settings on high, and I must say it looks way better than CoD4 etc... I think Crytek did a very good job with the engine. I remember the days when the first Unreal engine came out, boy oh boy...those were the days. Running Unreal with my 3dfx had me amazed...and it was only at 640x480!!!!

Anyway, I hope AMD can compete...so Intel's prices can go down. If not, then AMD must sell their chips at a lower price to compete with Intel.

I just hope more software developers create engines like the one used in Crysis. Sure, it might be buggy; sure, it might be demanding...but in my opinion it's the best-looking game out right now. It looks wonderful @ 1680x1050. I just hope people are going to push the hardware more to its limits, so the hardware producers have to work harder and spend more time on graphics.

I think it is the way of the future... I just hope they can make it better and better. It's good for the minority as well as the majority. I think the only downside is that you have to upgrade constantly...but that is what we do now as well, right?

Competition = good = happy customers getting good prices
March 18, 2008 8:02:33 PM

computertech82 said:
I'm pretty much waiting for the benchmarks. I've said it before: if it still cannot beat an X2 6400+ in gaming...it's still a piece of crap. Let's hope they can actually beat their own CPUs!


I'm all for waiting for benchmarks before making final decisions, but even if it's not quite as good as a 6400+ in gaming, it still might be better at business apps because it has four cores to do the work instead of only two, so it may not be crap by those standards.

But I expect my gaming will be done by an Intel Q9550 or Q9650 not long from now, so I won't be asking a Phenom to do that for me.
March 18, 2008 8:08:32 PM

mi1ez said:
1680x1050 is the third most used screen resolution on Steam...


LCD screens larger than 19" are getting pretty cheap, so I expect that number to go up.

You can get a 22" Samsung for around $250.
March 19, 2008 12:15:44 AM

Amen to that, brother. I'm running an X2 6400 on a K9A2 Platinum and hope to upgrade next to an X4 6400, though I won't hold my breath. This dual-core runs fine for me, although from some article I read, isn't the CPU going to change in the future to accommodate GPU functionality? So by the time a Phenom quad-core 6400 comes out, there will be CPU/GPU mix-and-matches happening.
As far as graphics are concerned, earlier in this discussion it was mentioned how slow Crysis runs. I think both Intel and AMD are at a crucial point in CPU evolution, only AMD is faced with stumbling blocks to its soon-to-be-outdated CPUs :-)
March 19, 2008 12:29:04 AM

Your 6400 will do better than any Phenom across a wide variety of workloads. In fact, I'm jealous of it!

No need to upgrade unless you're running highly threaded software.
March 19, 2008 12:38:31 AM

Gazz said:
Amen to that, brother. I'm running an X2 6400 on a K9A2 Platinum and hope to upgrade next to an X4 6400...


An X4 6400? Or do you mean a 9600? I've never heard of an X4 6400.

That said, for normal gaming an X2 6400+ should be better than the present Phenoms. The Phenom's main advantage is for multithreaded apps or business use, where a number of programs may be running at the same time. If AMD releases some faster Phenoms at some point, they might beat the X2 6400+ in gaming, but that's a big if and when.
March 19, 2008 2:08:16 AM

computertech82 said:
I'm pretty much waiting for the benchmarks. I've said it before: if it still cannot beat an X2 6400+ in gaming...it's still a piece of crap. Let's hope they can actually beat their own CPUs!


Then stop waiting. B3, even at 2.5GHz, is not going to beat an X2 6400+ in gaming. Except in Supreme Commander and Flight Simulator X, perhaps...

March 19, 2008 2:12:59 AM

Hmm, so will this drive down the prices of the B2 parts? Supposedly there is going to be a 9100e (B2) with a TDP of 65W in the $100-$130 range, then a 9150e soon after. Although it goes against all my sense, I'm wondering if I want to grab a 9100/9150 over a 4850e or dual-core Phenom. Oh well, I guess I have plenty of time to decide.
March 19, 2008 2:21:15 AM

EXT64 said:
Hmm, so will this drive down the prices of the B2 parts? Supposedly there is going to be a 9100e (B2) with a TDP of 65W in the $100-$130 range, then a 9150e soon after. Although it goes against all my sense, I'm wondering if I want to grab a 9100/9150 over a 4850e or dual-core Phenom. Oh well, I guess I have plenty of time to decide.


Since it seems you have an FX-60 at the moment, you would be taking a step down in performance, maybe even two steps down. I'd either hold out for the full-powered 9600 or turn to an AM2 with a 5000+ BE rather than going to a low-power Phenom.
March 19, 2008 2:27:23 AM

Sorry, I wasn't clear. The FX-60 and the rest of that 400W rig aren't going anywhere. This next processor is for a TV/HTPC computer. I would have just used the FX-60/2900XT, except I am now paying for electricity, and 400W for TV/movies is too much. A low-clocked Phenom would definitely be a step down from an FX-60, since they are close clock-for-clock.