
Wow Where AMD goes Intel will follow

March 19, 2004 3:56:11 PM

Anyone for leapfrog?

First 64bit x86, now onboard <A HREF="http://www.theinquirer.net/?article=14828" target="_new">memory controllers</A>.

Is Silicon on Insulator next?

Dichromatic for your viewing pleasure...


March 19, 2004 4:20:00 PM

LOL

I need to change my user name.
March 19, 2004 8:46:33 PM

Even the "unleliable and cheep" sometimes have good ideas. Intel has the nohow (if not knowhow) to use good ideas.
March 19, 2004 10:15:51 PM

Cuz Intel has done it first.

I need to change my user name.
March 20, 2004 12:12:01 AM

The P-M originally had an on-die memory controller, but it was removed because it didn't offer the core any real benefit that a third-party controller couldn't provide.

Also, there was a wafer yield target, and a larger core would have skewed that.

So in all actuality AMD and Intel are copying Alpha, which was the original company to propose this type of solution.

Xeon

<font color=orange>Scratch Here To Reveal Prize</font color=orange>
March 20, 2004 1:10:07 AM

Ehh, not really. The CPU's name was Timna, and it was cancelled about 8 years ago, well before the EV7 or Sun. The EV7 is similar to the K8 in a way: they have the same CPU core with a new L2 cache and memory controller. Except the EV7 has a much faster memory controller in terms of bandwidth: 8-channel RDRAM. Integrated links also.
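For scale, the EV7's aggregate memory bandwidth can be sketched with back-of-the-envelope arithmetic. The per-channel figure below is an assumption (roughly what a PC800 RDRAM channel delivers), not a quoted EV7 spec:

```python
# Rough aggregate bandwidth of an 8-channel RDRAM memory controller.
# per_channel_gb_s is an assumed figure (~PC800 RDRAM), for illustration only.
channels = 8
per_channel_gb_s = 1.6

aggregate = channels * per_channel_gb_s
print(f"~{aggregate:.1f} GB/s aggregate")
```

Even with a conservative per-channel number, eight parallel channels dwarf what a single-channel desktop controller of the era could move.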

I need to change my user name.
March 20, 2004 5:57:35 AM

Timna was a low-cost CPU incorporating an on-die memory controller, but also video & audio if I'm not mistaken... on-die memory controllers have been around for a long time in all sorts of hand-held computers...

The on-die memory controller is not as important as the whole NUMA approach to multi-processing, which, if I'm not mistaken, SGI did first with its MIPS and later on DEC with the EV7 and AMD with the K8. AMD is the first to use it in low-scale systems, using a different communication protocol than the EV7 (that one is aimed towards 16 to 64 CPU machines).

I doubt Intel will bring the memory controller to the desktop; I think it will be used only in Xeon and Itanium. In that way it will be mimicking AMD...
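The NUMA trade-off described above can be sketched with toy average-latency arithmetic. All numbers here are made-up illustrations, not measured values for any of these machines:

```python
# Toy model: average memory latency on a NUMA machine depends on how
# often a CPU hits its own on-die controller vs. a remote node's memory.
# The latency figures are illustrative assumptions only.
local_ns = 60    # assumed latency to local DRAM via the on-die controller
remote_ns = 110  # assumed latency for a hop to another node's DRAM

def avg_latency(local_fraction: float) -> float:
    """Expected latency given the fraction of accesses that stay local."""
    return local_fraction * local_ns + (1 - local_fraction) * remote_ns

print(avg_latency(1.0))  # all accesses local: 60.0 ns
print(avg_latency(0.5))  # half the accesses remote: 85.0 ns
```

The point of NUMA-aware software is to push that local fraction as close to 1.0 as possible, which is why the interconnect protocol matters as much as the controller itself.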

This post is best viewed with common sense enabled<P ID="edit"><FONT SIZE=-1><EM>Edited by iiB on 03/20/04 09:58 AM.</EM></FONT></P>
March 20, 2004 7:00:12 AM

You forgot the new product numbers instead of clockspeeds. :) 

As for SOI, intel always said they would implement FD-SOI at 65nm node. I don't think they could implement PD-SOI (like AMD and IBM) much earlier, unless they started working on it at least a year or two ago, and FD-SOI wafers are not yet available AFAIK, and likely won't be available in any volume before 65nm is ready. IOW, intel is stuck with bulk (+SS) for another while.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
March 20, 2004 8:09:44 AM

You can add that Intel went to copper interconnects in its 0.13u process while AMD implemented it in its 0.18u process.

This post is best viewed with common sense enabled
March 22, 2004 12:54:46 PM

Quote:
heh, but AMD is unreliable and cheap? why would Intel do anything they would do?

That's what I thought about it until the Athlon XP series. Actually, it is the first AMD series that is better than its Intel competition. That's when I turned to AMD.
AMD now is <b>reliable and cheap</b>
March 22, 2004 2:01:30 PM

Quote:
Ehh, not really. The CPU's name was Timna, and it was cancelled about 8 years ago, well before the EV7 or Sun.

Actually, DEC did an ODMC way before the EV7 or Timna. Look up the Alpha 21066 (ancient, ancient EV4 core with integrated memory controller and PCI bridge).

<i>Look! Up in the sky! It's a bird! It's a plane! It's...

...an asthmatic werehamster?

<LHGPooBaa> Well, @#!& on me.</i>
March 22, 2004 4:10:36 PM

In any case, that proves that AMD was the last.

I need to change my user name.
March 22, 2004 4:27:33 PM

Quote:
In any case, that proves that AMD was the last.

Or, we could say that AMD was the first to do it "right" (the ODMC on the 21066 and Timna were both dead-end technologies). It's easy to put either a positive or negative spin on it without either view really being worth much. :wink:

<i>Look! Up in the sky! It's a bird! It's a plane! It's...

...an asthmatic werehamster?

<LHGPooBaa> Well, @#!& on me.</i>
March 22, 2004 5:46:43 PM

We can argue about this for a long time.

I need to change my user name.
March 22, 2004 6:14:28 PM

I think the abandoning of Megahertz ratings is the biggest first step. It would be like a car company advertising their engines based on top RPM and displacement. In other words, megahertz alone doesn't tell the whole story.

The problem I have is that both AMD's and Intel's new solutions don't give the uninformed customer any insight into performance. How would a regular person know if an Athlon 64 3400+ or a Pentium 4 550 is faster? There needs to be an industry standard. Just like all engines can be measured with horsepower, they need to agree on a way that adequately and accurately measures the capability of a processor. I know it would be tough because some chips excel in different areas, like how Athlons kick butt in games while the Pentium 4 kicks butt in audio/video applications, but there still may be a way to do it.
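One common way to collapse mixed benchmark results into a single rating (it's the approach SPEC takes for its composite scores) is a geometric mean of normalized scores, which keeps one dominant category from swamping the result. A minimal sketch, with made-up per-category scores (1.0 = parity with some reference chip):

```python
import math

# Made-up normalized benchmark scores for a hypothetical CPU.
scores = {"games": 1.30, "audio/video": 0.80, "office": 1.00}

def geo_mean(values):
    """Geometric mean via logs; one huge win can't dominate the average."""
    vals = list(values)
    return math.exp(sum(math.log(v) for v in vals) / len(vals))

rating = geo_mean(scores.values())
print(f"overall rating: {rating:.3f}")
```

A chip that wins big in one category but loses in another ends up near parity overall, which is roughly the behavior you'd want from a cross-vendor rating.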

-------------------------------------------
<font color=blue> "Trying is the first step towards failure." </font color=blue>
March 22, 2004 7:46:20 PM

>How would a regular person know if an Athlon 64 3400+ or a Pentium 4 550 is faster?

heh.. if he is upgrading from an old Pentium 3-650, he might well think it's slower :D 

= The views stated herein are my personal views, and not necessarily the views of my wife. =
March 22, 2004 8:25:01 PM

Quote:
That's what I thought about it until the Athlon XP series. Actually, it is the first AMD series that is better than its Intel competition. That's when I turned to AMD.
AMD now is reliable and cheap

AMD CPUs were always reliable. It was mobos/chipsets that were holding Athlon reliability back until the KT266A. With nForce/nForce2, AMD platform reliability has reached perfection.

------------
<A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
Anonymous
March 30, 2004 9:36:54 PM

So you say the EV7's 8-channel RDRAM is "bad" or "slow"?

Feels refreshing
March 31, 2004 3:02:58 AM

Basically, all it is: people in the real world don't want to understand computers. Just like their cars, they just buy what people say is the best. People also don't research their purchases as much as we do.
The thing that gets me is, sadly... I don't want to start a fight with anyone over this, but in my opinion, mainly at least in the United States, computers are only used for email and Internet. If that's all you plan on doing, why wouldn't you buy a Duron-based CPU? It's almost retarded not to; [-peep-] for 40 more dollars you can have an XP which will do these functions perfectly.

But instead people buy the 2,000-dollar Dell with a P4 3GHz 800 FSB CPU. Why do you need that system if you are looking at porn? It definitely won't go much faster. Although you could probably keep popping those ads up like no other, but for another 150 dollars? Do you really need that kind of on-the-fly porn?

Now once you start getting into IT it's understandable to use an A64 or a P4 Northwood. In my mind it's not reasonable to use a P4EE for anything, but that's my AMD fanboyism or something like that.


<A HREF="http://arc.aquamark3.com/arc/arc_view.php?run=277124623" target="_new">http://arc.aquamark3.com/arc/arc_view.php?run=277124623...;/A>
46,510 , movin on up. 48k new goal. Maybe not.. :/ 
March 31, 2004 7:43:11 AM

A lot of people in the real world do only use email/net (& porn? you can get that on a computer? :wink: ), but another lot of people (although probably not <i>quite</i> so many) also want to play the latest games. They're the ones who benefit from buying good stuff.

It's almost sad when you see some poor sap buying a 'real high spec' machine with a 2.6GHz Celeron & a GeForce FX5200, and then wondering why Far Cry doesn't run as well as they think it should.

Plus, idiots looking at porn and suchlike will quickly have millions of stupid little 'porno-dialer' tray icons and the like, so without a high-spec machine their PC would grind to a halt through lack of available memory etc. much sooner :lol: . Especially if they buy a Dell or something, which probably comes with loads of pre-installed crap too.

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744