Kuma cancelled. - Page 2

June 10, 2008 7:52:10 PM

Gill is the intel equivalent of thunderman.
June 10, 2008 7:54:31 PM

modtech said:
Gill is the intel equivalent of thunderman.


O come on now, Gill at least has facts on his side, even if he chooses a rather grating manner to disperse them. Thundernoob is just a tool.
June 10, 2008 8:05:03 PM

i ask a question everytime i come on the forum you know B-unit.

"when will we stop argueing?":p 
June 10, 2008 8:10:36 PM

jimmysmitty said:
Well, you are right about the 3DMark06 charts. It does take a 2.66GHz chip to beat a 3.2GHz K8. But then again, 3DMark06 doesn't show real-world performance. It's a nice number to look at, but it's not all-important.

But one thing: in the XviD benchmark, do me a favor. Click the link at the bottom of the chart that says "View all Products". It will show all the CPUs tested since 2007. You will see that an E6600 (a 2.4GHz chip) beats a 6400+ X2. So that's an 800MHz clock advantage for the X2, and it still gets beat.

So he was close. A 2.2GHz Core 2 will beat a 3GHz K8: an 800MHz clock difference. Next time make sure you check the whole chart and not just what seems to be there.


I don't know if you pick a guitar ... or pick your nose ... but you sure know how to pick your cherry. :whistle: 
June 10, 2008 8:17:56 PM

Wisecracker said:
I don't know if you pick a guitar ... or pick your nose ... but you sure know how to pick your cherry. :whistle: 


I didn't pick it. It was B-Unit. He picked it. I just let him know that in that particular chart he posted, a 2.4GHz C2D beat a 3.2GHz X2 despite an 800MHz disadvantage.

I don't see 3DMark06 as a viable measurement overall. I only look at real-world examples.

But my point still stands: you have to look at the real world, and in most cases a 2.4GHz C2D will easily outpace a 6400+ X2.
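As an aside, the clock-for-clock arithmetic being argued here is easy to sanity-check. Below is a minimal sketch in Python, assuming only the figures quoted in these posts (a 2.4GHz C2D tying or beating a 3.2GHz X2); the function name is just for illustration, and the result says nothing about any specific benchmark:

# Minimal sketch of the clock-vs-per-clock-throughput reasoning above.
# Figures are the ones quoted in this thread, not new benchmark data.

def implied_per_clock_advantage(winner_ghz: float, loser_ghz: float) -> float:
    """Minimum per-clock advantage implied when the lower-clocked 'winner'
    ties or beats the higher-clocked 'loser' on the same workload."""
    return loser_ghz / winner_ghz

if __name__ == "__main__":
    c2d, x2 = 2.4, 3.2  # GHz
    ratio = implied_per_clock_advantage(c2d, x2)
    print(f"Clock deficit: {x2 - c2d:.1f} GHz")
    print(f"Implied per-clock advantage: at least {ratio:.2f}x (~{(ratio - 1) * 100:.0f}%)")
    # Output:
    # Clock deficit: 0.8 GHz
    # Implied per-clock advantage: at least 1.33x (~33%)

In other words, a 2.4GHz part beating a 3.2GHz part implies at least roughly a one-third per-clock advantage, which is the gap these posts are arguing over.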
June 10, 2008 8:31:11 PM

iluvgillgill said:
i ask a question everytime i come on the forum you know B-unit.

"when will we stop argueing?":p 


I think we both have the same problem: we think we read one thing when the other typed something else...

Ah well, nothing like a good flame war to keep the next ice age at bay. :whistle: 
June 10, 2008 8:34:36 PM

I will be an Intel fanboy and tell you what's really wrong with AMD, as the answer is simple...

They cannot keep up with a Monster company like Intel; they simply do not have the resources or staff. Perhaps AMD's inadequacies aren't really as bad as they appear; it is Intel OWNING them that makes them seem horrible, but in fact they are actually decent... just not compared to the Monster Intel.


**It must be noted that a few years ago I was an AMD fanboy... whoever is better, I'm a fanboy of; I jump on/off the wagon more times than an alcoholic in a liquor store.
June 10, 2008 8:34:40 PM

haha yeah true.

or maybe Intel and AMD join together?so we got nothing to argue about!lol
June 10, 2008 9:18:12 PM

HAHA AMD brought out a quad 1st and a x2 4400 clocked at 3.6ghz. CLOWN
June 10, 2008 9:31:35 PM

someguy7 said:
HAHA AMD brought out a quad 1st and a x2 4400 clocked at 3.6ghz. CLOWN


Um what? No. Intel had the first quad, QX6700. AMD had the first true quad, Phenom.
June 10, 2008 9:35:50 PM

dont worry about it he jus "someguy" as his name suggested!
June 10, 2008 9:41:35 PM

jimmysmitty said:
Um what? No. Intel had the first quad, QX6700. AMD had the first native quad, Phenom.


Fixed

And WTH, AMD has never released anything clocked higher than 3.2GHz.
June 10, 2008 10:23:43 PM

B-Unit said:
O come on now, Gill at least has facts on his side, even if he chooses a rather grating manner to disperse them. Thundernoob is just a tool.


Facts or not, Gill is just as much of an annoying fanboy and tool. It makes me defend AMD even though I'm a big Core 2 fan.
June 10, 2008 11:03:41 PM

Quote:
Kuma Canceled


Oh MAN! lol... it just keeps getting better and better. Dang, AMD is in a tight spot. Their aging K8 might end up going for the record of longest-living architecture of all time! Just one more year to tie Intel's six-year Netburst record! Go AMD GO!!!!

June 10, 2008 11:22:56 PM

God I love this war (DAAMIT vs Intel)!
June 11, 2008 12:13:16 AM

i think the fanboy intel vs amd in here is more intense then the actual fight going on out there!INTERESTING!!!
June 11, 2008 12:49:21 AM

iocedmyself said:
OK, first off, all AMD multi-core CPUs are native dual/tri/quad.

Second, AMD brought out quads first because their AM2 dual-cores were already compatible with the AM2+/Phenom platforms. Dual-cores have been around for 5 years; they are getting to be the entry-level CPU solution. Anyone still using a single-core chip today would see huge performance gains getting an AM2 dual core in a system that would cost less than $300 to build. Anyone using a dual-core system is going to be more inclined to purchase a quad-core rig.

Lastly... those benchmarks are biased simply by the test platform being 32-bit. Intel has rather lackluster performance in 64-bit applications, whereas AMD excels in 64-bit computing. Benchmarks that test 64-bit hardware in 32-bit OSes with 32-bit software just annoy me. But rather than just rant about the idiocy and unfairness of the practice, I'll just give numbers.

Cinebench, much like 3ds Max 9, Maya, Softimage and all the other 3D modeling and animation software, is 64-bit compatible, but then so are all of the other benchmarking suites commonly used. I'm still running a Socket 939 4400+ X2 Toledo-core CPU, on a DFI cff3200 Dr/g mobo, with 4 gigs of RAM and an HD2900XT gfx card. I, however, choose to run my 64-bit hardware on 64-bit Vista Ultimate. Now, while I normally run my CPU OCed at 3.6GHz, to be fair I set the clock back down to 2.2GHz to run Cinebench.

Running the 32-bit benchmark at 2.2GHz I scored a meager xCPU 3023
Running the 64-bit benchmark at 2.2GHz I scored a meager xCPU 3540

Hmm... my AMD CPU performed 17% better when running the 64-bit version of the same application.

Sooo... it would be a fair assumption that the 7625 score the Phenom X4 9750 received in 32-bit Cinebench would be around 8921 running 64-bit Cinebench. Maybe more, maybe less, but certainly higher than the 32-bit benchmark. As for Intel's performance, I really don't know what the difference would be, as I have not seen many 64-bit benchmarks run on their platforms.



I was quoting some silly AMD fanboy post from this thread, and then I got jumped on by the stupid Intel fanboys. CLASSIC
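For what it's worth, the 32-bit vs 64-bit Cinebench arithmetic in the quoted post does add up. Here's a minimal check in Python, assuming the poster's own scores; the extrapolation simply reuses his measured speedup and says nothing about how a Phenom would actually scale:

# Check of the quoted Cinebench numbers: 3023 (32-bit) vs 3540 (64-bit) for the
# 4400+ X2 at 2.2GHz, and the 7625 32-bit score quoted for the Phenom X4 9750.
x2_32bit, x2_64bit = 3023, 3540
speedup = x2_64bit / x2_32bit  # ~1.171
print(f"Measured 64-bit speedup: {(speedup - 1) * 100:.1f}%")  # ~17.1%

phenom_32bit = 7625
print(f"Naive 64-bit estimate for the X4 9750: {phenom_32bit * speedup:.0f}")  # ~8929

The ~17% speedup matches the post, and the ~8929 estimate lines up with the 8921 figure quoted there (which rounds the speedup down to a flat 17%).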


June 11, 2008 7:11:30 AM

B-Unit said:
O come on now, Gill at least has facts on his side, even if he chooses a rather grating manner to disperse them. Thundernoob is just a tool.


Wow, we've gotten to the point where AMD's fans will attack each other.

*gets skewered by javelin*

Hmm... who's the local medic?
June 11, 2008 9:03:42 AM

iluvgillgill said:
^
i mean 2900 is not a bad card at all!but its AMD's driver support that stop it from bringing out its full potential.no offence to the HD2900 i just wanna P him off and correct him if you know what i mean!

Are you saying that nvidia's seldom-if-ever, hackneyed driver support is better than ATI's?
This place is a lot about credibility, and you just wasted all of yours.
June 11, 2008 11:32:06 AM

whats this kuma-tion all about?
June 11, 2008 12:51:01 PM

Lets face it Gill your an illiterate annoying tool.



Have a nice day :) 
June 11, 2008 1:06:52 PM

I'll try to make a coherent post, even if this thread has derailed to Davy Jones' Locker.

Ideology

It seems to me a logical decision to drop X2 development. First off, the way forward is multi-core or multi-threading.
So beating on the X2's past was stupidity. The X2s were good CPUs (hey, I'm running one, and for what I paid, it is a good CPU), but it's time to drop them. AMD also doesn't need a "cheapo" line; they are already aiming at the "value" CPUs.

Fabrication

Excellent decision. AMD will only make the Barcelona core (X4 & X3). There is basically one chip being made, and thanks to economies of scale that makes the CPUs cheaper to produce. If you shrink to 45nm, you only have to shrink one chip, and you optimize your production line for one chip. Seems to me a wise decision.

Development

Nehalem and Larrabee will only arrive within the next year. Some versions are already out for testing, but hey, they're hardly final, and they will be expensive. If AMD doesn't show any good competing CPU, Intel has no reason whatsoever to lower its pricing. That is a heads-up for you Intel fanboys: if you want to keep buying Intel CPUs, and AMD isn't pulling out some goodies, be ready to pay the price. I want AMD to show some goodies, not only because I am an AMD fanboy, but because Intel chip pricing will go through the roof. Back on sub-topic, fine-tuning one core (Barcelona) is easier than fine-tuning several. With limited staff and money, it's a wise decision.

For those reasons I believe dropping Kuma was inevitable. Although it might seem a risky decision, it seems to me a good one. Why buy a Celeron (which always sucked, and here I believe there isn't much to argue) when you can buy an X3, or the remaining X2s in stock? The real question is: where is the Sempron line going?
June 11, 2008 1:40:14 PM

Reynod said:
Lets face it Gill your an illiterate annoying tool.



Have a nice day :) 


WOW!!!so impressive!!!NOT!!!
June 11, 2008 1:41:21 PM

endyen said:
Are you saying that nvidia's seldom-if-ever, hackneyed driver support is better than ATI's?
This place is a lot about credibility, and you just wasted all of yours.


but apparrantly even though those 8series driver is more then half year old.but the performance its giving is way pass AMD's offering at the same level.does that mean anything to you?maybe not i guess.
June 11, 2008 2:50:31 PM

iluvgillgill said:
haha yeah true.

or maybe Intel and AMD join together?so we got nothing to argue about!lol


hah! you'll find something.
if its a co-design between the two? yeaaahh....say AMD did the memory controller....so then we hear:
"chip would rock if the MC didn't SUUUUK!"
"you're a monkey-banger, dude....the MC is the only part of the arch that works!"
"you're an effing MC fanboi!"
...
and on and on it goes, repeat until dead.
then, repeat it some more.

June 11, 2008 5:04:25 PM

Quote:
Why buy a Celeron (which always sucked, and here I believe there isn't much to argue) when you can buy an X3, or the remaining X2s in stock? The real question is: where is the Sempron line going?

True. EXCEPT most of the low-end Celeron/E2xx0 CPUs (without OCing, the E2xx0s are just cr@p) are being bought by OEMs (i.e. Dell, HP, etc.). I believe the reason for this is that AMD is not able to make enough CPUs to cover demand.
June 11, 2008 5:18:05 PM

someguy7, I wasn't jumping on you. But you didn't have it quoted, hence I thought you said it.

As for AMD dropping Kuma, they still messed up. Why buy an X3 when you can get an X4 for near the same price? Why buy an X4 when for a bit more you can get a C2Q Q6600?
June 11, 2008 5:31:42 PM

Reynod said:
Lets face it Gill your an illiterate annoying tool.



Have a nice day :) 


The next time you try to insult someone you might want to learn the difference between "your" and "you're".

I find it somewhat amusing to read some of the posts on these forums. You have Gill whose keyboard doesn't seem to have a Shift key and whose space bar doesn't seem to work after periods. I generally skim over most of his posts because they are hard to read due to the keyboard issues he seems to have. Some of his posts do have facts so I wouldn't necessarily throw him in the Thunderman category, but he definitely likes to side with Intel.

On the other hand you have Reynod whose posts usually try to annoy Intel fans and generally have no useful information in them.

When the two keep posting in the same thread it can make for an interesting read (assuming Gill's posts are readable.) :p 

June 11, 2008 6:10:40 PM

Quote:
Why buy a Celeron (which always sucked, and here I believe there isn't much to argue) when you can buy an X3?
If the X3 costs as little as a Celeron, I could see your point.
AMD's K8 core, while incredibly strong in its day, is now like Intel's old Netburst line back in 2005: still semi-competitive in some tasks, but overall severely outpowered. You know it's bad when Intel's Celeron can match the performance of AMD's Athlon64 clock for clock in gaming and trounce it in encoding, while using less energy! The X2 line is competitively priced and still a viable alternative to Intel for those who do not overclock their processors. The fact that you can get the X2 4800 for just $60 is simply amazing. Just two years ago such performance would have cost $400+. However, AMD's X3 lineup, and for the most part their X4 lineup, are all vastly inferior to Intel's offerings at the same price points. It seems to me that AMD's old K8 is so good, or the K10 is so poor, that AMD users have to decide between a fast K8 dual core or a slow and hot AMD Phenom X3 or quad. Until the K10 can clock high enough, AMD will be forced to drag along their K8 line to support the single-core and dual-core markets. It's just unfortunate, because AMD is not going to be updating these CPUs... the X2 you could buy in 2005 will be basically the same X2 you could get today, and maybe even a year from now!! In contrast, Intel went from Pentium D to Conroe to Penryn, and perhaps will even have a dual-core Nehalem by the time AMD finally retires K8! Now, that would be really bad.
June 11, 2008 9:11:54 PM

uguv said:
The next time you try to insult someone you might want to learn the difference between "your" and "you're".

I find it somewhat amusing to read some of the posts on these forums. You have Gill whose keyboard doesn't seem to have a Shift key and whose space bar doesn't seem to work after periods. I generally skim over most of his posts because they are hard to read due to the keyboard issues he seems to have. Some of his posts do have facts so I wouldn't necessarily throw him in the Thunderman category, but he definitely likes to side with Intel.

On the other hand you have Reynod whose posts usually try to annoy Intel fans and generally have no useful information in them.

When the two keep posting in the same thread it can make for an interesting read (assuming Gill's posts are readable.) :p 


the problem you mention is simply because I use MSN chat too much, and as you may or may not know, we don't use CAPITALS or S P A C E S or SHIFT, to take out the delays in typing. That's why I always get told off at the place I work, and back in school I got told off by the teachers for not typing properly and not using upper case.

Interestingly, I found Reynod more offensive than B-Unit for some reason. This forum is certainly interesting! :lol:
June 11, 2008 10:08:55 PM

joefriday said:
Quote:
Why buy a Celeron (which always sucked, and here I believe there isn't much to argue) when you can buy an X3?
If the X3 costs as little as a Celeron, I could see your point.
AMD's K8 core, while incredibly strong in its day, is now like Intel's old Netburst line back in 2005: still semi-competitive in some tasks, but overall severely outpowered. You know it's bad when Intel's Celeron can match the performance of AMD's Athlon64 clock for clock in gaming and trounce it in encoding, while using less energy! The X2 line is competitively priced and still a viable alternative to Intel for those who do not overclock their processors. The fact that you can get the X2 4800 for just $60 is simply amazing. Just two years ago such performance would have cost $400+. However, AMD's X3 lineup, and for the most part their X4 lineup, are all vastly inferior to Intel's offerings at the same price points. It seems to me that AMD's old K8 is so good, or the K10 is so poor, that AMD users have to decide between a fast K8 dual core or a slow and hot AMD Phenom X3 or quad. Until the K10 can clock high enough, AMD will be forced to drag along their K8 line to support the single-core and dual-core markets. It's just unfortunate, because AMD is not going to be updating these CPUs... the X2 you could buy in 2005 will be basically the same X2 you could get today, and maybe even a year from now!! In contrast, Intel went from Pentium D to Conroe to Penryn, and perhaps will even have a dual-core Nehalem by the time AMD finally retires K8! Now, that would be really bad.


Naa, I believe the X2 will be retired now. Due to economies of scale the X3 will become cheaper, and the X4 will have more variety.
Maybe they will come out with a 6-core, or another core model, I think. The next few months will be interesting though. The performance crown will be Intel's for sure for the next year or more, but I think AMD will do just fine.

We have two possible opinions/forecasts. Let's play cards and watch the fireworks while they work!!!
June 11, 2008 10:22:47 PM

spongebob said:
:pfff:  A song the AMD fanboys sing to themselves whilst crying in their cheerios. Scroll up to Yomama's post (it's the second post) and read his signature.


The market doesn't care what's in a PC, let alone a package. Intel survived with the lousy space-heater Netburst architecture in an OEM market that's now keeping AMD alive, now that it's a level playing field.

I'm not crying in my Cheerios, as my self-worth is not based upon "my" product being faster than "your" product. IMHO, AMD is in a "two out of three ain't bad" situation right now. I have a nice Gigabyte 780G board waiting for either an 8750 or something better next September, as I decided not to transfer the X2 over. I'll tolerate AMD's CPU performance being a bit less than Intel's because I prefer ATI GPUs and AMD/ATI chipsets.

As long as I can get the image quality of free AVIVO on my GPUs while still doing well enough in games, I don't care if Nvidia fudges this or that benchmark's image quality to get a few extra fps. Intel doesn't have chipsets at the price point I build our PCs at, so I don't even look at their CPUs nowadays.

If AMD catches up to Intel in CPUs after Deneb, then so be it. If they don't, then I'm sure they'll survive at the low to midrange. I don't insist that AMD survive just to lower the cost of Intel or Nvidia tech through competition. If AMD died as a CPU company, I'm sure that ATI chipsets and GPUs would carry on, and at least Intel supports Crossfire in case I want to go the CrossfireX route down the line.




June 11, 2008 11:08:25 PM

joefriday said:
Quote:
Why buy a Celeron (witch they always sucked, and here i believe there isn't much to argue) when you can buy a X3 ?
If the X3 costs as little as a celeron, I could see your pont.
AMD's K8 core, while incredibly strong in it's day, is now like Intel's old Netburst line back in 2005; still semi-competitive in some tasks, but overall severely out powered. You know it's bad when Intel's Celeron can match the performance of AMD's Athlon64 clock for clock in gaming and trounce it in encoding, while using less energy! The X2 line is competitively priced and still a viable alternative to Intel for those who do not overclock their processors. The fact that you can get the X2 4800 for just $60 is simply amazing. Just two years ago such performance would have cost $400+. However, AMD's X3 lineup, and for the most part their X4 lineup, are all vastly inferior to Intel's offerings at the same price points. It seems to me that AMD's old K8 is so good, or, the K10 is so poor, that AMD users have to decide between a fast K8 dual core, or a slow and hot AMD Phenom x3 or quad. Until the k10 can clock high enough, AMD will be forced to drag along their K8 line to support the Single core and dual core markets. It's just unfortunate, because AMD is not going to be updating these CPUs... the X2 you could buy in 2005 will basically the same X2 you could get today, and maybe even a year from now!! In contrast, Intel went from Pentium D to Conroe, to Penryn, and perhaps will even have a dual core Nehalem by the time AMD finally retires K8! Now, that would be really bad.


There will be a dual-core Nehalem. But it will probably be more like the Pentium Dual-Core and will be the one with the GPU in the package, which will probably work out great for business PCs, just like AMD's Fusion will if that ever comes to fruition.

yipsl said:
The market doesn't care what's in a PC, let alone a package. Intel survived with the lousy space-heater Netburst architecture in an OEM market that's now keeping AMD alive, now that it's a level playing field.

I'm not crying in my Cheerios, as my self-worth is not based upon "my" product being faster than "your" product. IMHO, AMD is in a "two out of three ain't bad" situation right now. I have a nice Gigabyte 780G board waiting for either an 8750 or something better next September, as I decided not to transfer the X2 over. I'll tolerate AMD's CPU performance being a bit less than Intel's because I prefer ATI GPUs and AMD/ATI chipsets.

As long as I can get the image quality of free AVIVO on my GPUs while still doing well enough in games, I don't care if Nvidia fudges this or that benchmark's image quality to get a few extra fps. Intel doesn't have chipsets at the price point I build our PCs at, so I don't even look at their CPUs nowadays.

If AMD catches up to Intel in CPUs after Deneb, then so be it. If they don't, then I'm sure they'll survive at the low to midrange. I don't insist that AMD survive just to lower the cost of Intel or Nvidia tech through competition. If AMD died as a CPU company, I'm sure that ATI chipsets and GPUs would carry on, and at least Intel supports Crossfire in case I want to go the CrossfireX route down the line.


What I don't get is that you talk about how expensive an Intel chipset is when a good P35 mobo, the Gigabyte DS3L (or is it DSL3?), is only around $80. And you can always get a higher-end mobo with tons of features and extras, like the Asus P5K-E, for pretty cheap as well, normally in the $100-$150 range. But Asus has always been more of a premium brand.

The 780G is great for an HTPC, but personally I would prefer to just grab a nice mobo with a good chipset that allows good OCing and throw in a nice high-end GPU. I too prefer ATI GPUs, but their chipsets are not what they used to be, the 780G aside.

AMD will not die as a CPU company. Intel would not let that happen, as that would reflect on them, even if AMD made all the wrong choices at the wrong time that got them in the red, such as buying ATI when Conroe was out smacking their CPUs around.
June 12, 2008 5:44:53 AM

iluvgillgill said:
but apparrantly even though those 8series driver is more then half year old.but the performance its giving is way pass AMD's offering at the same level.does that mean anything to you?maybe not i guess.

See the problem?
Perhaps you should go to the gfx forum and ask the mod there what is wrong with nvidia's drivers. He's better at explaining things to the unenlightened.
June 12, 2008 9:22:02 AM

iluvgillgill said:
the problem you mention is simply because I use MSN chat too much, and as you may or may not know, we don't use CAPITALS or S P A C E S or SHIFT, to take out the delays in typing. That's why I always get told off at the place I work, and back in school I got told off by the teachers for not typing properly and not using upper case.

Interestingly, I found Reynod more offensive than B-Unit for some reason. This forum is certainly interesting! :lol:


Yup, it sure is interesting...
You should talk to AMDfanboi... almost all my bias is gone...
June 12, 2008 9:42:20 AM

uguv ur a primary school teacher right??

suck it up ...

When I find something useful to say then I'll post something astounding.

I am short on astounding today ... sorry.

Since their is so much crap in this thread I will just throw hippo eggs around.

It is much more interesting since the content here is poor, aimless, and useless.

June 12, 2008 9:44:22 AM

Just to throw another cat amongst the pigeons in here... a couple of things I'd like to mention :) 

First, and most related to the post's original topic... AMD have denied that Kuma is cancelled and state that it is going to be released in 2H08. However, Kuma is apparently *not* a dual-core Phenom, and they were quite keen on disassociating Kuma from the Phenom brand name.

http://news.cnet.com/8301-10784_3-9966067-7.html

Second, while ATI's performance (due to drivers or whatever) is currently lagging behind nvidia's, their image quality is still superior. This week I've been building an HTPC for my parents based around an HD2400 Pro ... I watched some HD video using the same monitor for both it and my main rig's 9600gt. I took a little time to make sure colour calibration for the screen was done properly, and then fired the video up - the 2400 pro delivered a much nicer picture :) 

(Take the next bit with a pinch of salt, as it may not happen ... dependent on MS)

In addition to this, since the 2xxx ATI cards were introduced, ATI have implemented their new tessellation engine. Recent rumours suggest that MS may be adding some support for this into the DirectX API sometime in the not-too-distant future. If this is true, then ATI cards going back to the 2xxx series will all receive a decent performance boost in future titles which support this feature.

Edit due to my forgetting to actually finish a sentence (2400pro image quality comparison).
June 12, 2008 10:21:07 AM

useful coret ... thanks.
June 12, 2008 1:30:00 PM

Good job coret. Nice to see at least someone is actually checking facts and posting something useful rather than pointless garbage.

Kuma is not canceled. It's not a Phenom with disabled cores; it comes from an entirely dedicated manufacturing process. I speculate it will be a lower-midrange to entry-level product. But since it's not a Phenom, it might overclock like a K8.
I'd definitely pick ATI for a Windows-based HTPC. (In the future there's a good chance of that applying to a Linux-based HTPC too, but not today.)
Tessellation is interesting, but don't count on it. (In other words, don't base your purchasing decision on it.)
June 12, 2008 3:58:35 PM

Reynod said:
uguv ur a primary school teacher right??


Incorrect

Quote:
Since their is so much crap in this thread I will just throw hippo eggs around.


I think you mean, "Since there is so much crap..." :kaola: 


Just messing with ya Reynod. :)  I have read several of your posts where you have some good information, but others just seem like they're made to bother the "blue" fanbois.





June 12, 2008 7:36:31 PM

modtech said:
Good job coret. Nice to see at least someone is actually checking facts and posting something useful rather than pointless garbage.

Kuma is not canceled. It's not a Phenom with disabled cores; it comes from an entirely dedicated manufacturing process. I speculate it will be a lower-midrange to entry-level product. But since it's not a Phenom, it might overclock like a K8.
I'd definitely pick ATI for a Windows-based HTPC. (In the future there's a good chance of that applying to a Linux-based HTPC too, but not today.)
Tessellation is interesting, but don't count on it. (In other words, don't base your purchasing decision on it.)


Hmmm. I always thought Kuma was AMD's Phenom-like dual core. Then I heard they were doing a revamped K8, and maybe Kuma took over that name. From what I have heard, if AMD does do a dual core and it is the K8 revamp, it will be K8 with Phenom features added on. So even with its K8 roots, the OCing will not be guaranteed till release, and even with the OC ability of a K8, that doesn't help at a clock-per-clock level.

But with their tri-cores so cheap, I almost agree that there is close to no way for AMD to create and sell dual cores and make any money.
June 12, 2008 11:13:11 PM

L3 cache on a dual core looks like a waste to me. They might make some money with the L3-cache-less variant (Rana) with low clocks. If there is indeed a launch in Q3 we should be hearing more about this very soon.
June 13, 2008 12:30:46 AM

iluvgillgill - rarely do I speak, cos basically there are so many people on here who can give a better answer, so I only answer on the things I feel safe on, so I don't spread crap that could hurt anyone's rig. You, however, are a troll; saying that you aren't does not make it so. You may answer a criticism in a thread well, but those of us who view the forums as a whole can't help but see how you seem to take an almost childlike glee in reporting anything remotely negative about AMD. Even the open Intel fanboys are probably slightly embarrassed... but more respect to them for at least being open about it. Now please feel free to respond about how I know nothing in your broken English, probably sat at your desk with a stonk on because AMD have lost 1c on the stock market or something.

Peace

carod
June 13, 2008 12:36:54 AM

And yes, you have more posts than me (cited as a reason for superiority in another post of yours). O. M. G. Dang, I must lose this life I have with real people in it so I can post more.


NOT
June 13, 2008 2:41:00 AM

I wonder if Kuma is actually based on Turion Ultra.
June 13, 2008 6:34:43 AM

^ Wasn't something going around saying that Turion Ultra was K8 + M780G?

FUD these days...
June 13, 2008 6:39:11 AM

From my understanding, Turion Ultra was a hybrid of K8 core and K10 interconnect.

Can't remember the codename for it.
June 13, 2008 6:45:07 AM

Puma?
June 13, 2008 7:05:30 AM

Yeh. Thanks for the reminder :) .

Because from a logistics point of view, it doesn't make sense for AMD to down-bin the Phenom X4 or create a whole new mask for Kuma. The only possible explanations are: 1. Kuma = K8, or 2. Kuma = Puma.
June 13, 2008 8:27:04 AM

More likely K8, because Phenom is native quad-core and Kuma would have to be native dual-core, thus requiring a die redesign... I'm sure any sane AMD fanatic knows that dual-core is coming of age... since AMD is so far behind, it wouldn't make sense to leap backwards... unless they wanted higher clocks... :) 

Still, I believe they are focusing on Shanghai and Deneb and shouldn't focus on Kuma...