
Intel demonstrates 80-core Teraflop chip

February 11, 2007 6:26:30 PM

I know Intel talked about this first at the last IDF ... but I don't know if they demonstrated a working version before.

Intel Teraflop Chip



February 11, 2007 7:11:03 PM

8O 8O 8O
February 11, 2007 8:11:50 PM

:lol:  Currently I don't work anywhere, which is why I have time to read all these forums. I would love to work at Intel though. :D  Why do you ask?
February 11, 2007 8:24:42 PM

Will it fit in Super Socket 7??? :lol: 
February 11, 2007 9:11:10 PM

Wow, Intel gets 80 cores working to show the public before AMD's 4 :lol: 
Anywho, sorry AMD, I love you anyway. Maybe Intel did this to steal some of the spotlight from AMD's native quads? What do you guys think?
February 11, 2007 9:36:15 PM

Pretty neat stuff, eh?
February 11, 2007 9:39:00 PM

But does it run Unreal Tournament 2007? :( 
February 11, 2007 9:49:37 PM

UT2007 no, but UT2015 very likely :) 
One (or more :)  ) could be used on a physics accelerator card. Or perhaps on a raycasting graphics card...
February 11, 2007 10:09:16 PM

This is pretty much the most awesome thing ever, I mean, an actual working 80-core CPU. WOW. 8O

If Intel sold these things retail there would be no reason for the C2D anymore, or any other AMD or Intel CPU for that matter. Even if they're over $1000 each I would still buy one.


I wonder how it OCs? :D 
February 11, 2007 10:21:23 PM

Keep in mind, these are very simple cores, not full x86 cores. Intel is focusing on how they are networked together. The fundamentals, I'm sure, will make their way into an x86 implementation, but Intel is saying not to look for this tech to be marketable for another 5-10 years.
February 11, 2007 10:53:03 PM

The point of developing such a chip was to experiment with interconnect scaling at such a large number of cores. The classic point-to-point or crossbar approach seems inefficient there.
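A quick way to see why a full crossbar stops scaling is just to count the links each topology needs. A toy sketch (my own numbers; the 8x10 tile grid is the layout reportedly used in Intel's demo chip, not something stated in this thread):

```python
# Rough link-count comparison for n cores (illustrative only).

def crossbar_links(n):
    # A full crossbar gives every core a direct link to every other core.
    return n * (n - 1) // 2

def mesh_links(rows, cols):
    # A 2D mesh only wires each tile to its immediate neighbors.
    return rows * (cols - 1) + cols * (rows - 1)

# Intel's teraflop demo chip is reportedly an 8x10 grid of 80 tiles.
print(crossbar_links(80))  # 3160 point-to-point links
print(mesh_links(8, 10))   # 142 neighbor-to-neighbor links
```

The crossbar's wire count grows quadratically with core count while the mesh grows linearly, which is roughly why a routed tile network looks attractive at 80 cores.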
February 12, 2007 12:58:51 AM

Quote:
A little old news, but noteworthy nonetheless. Someone pointed to Charlie's article and his take on it, and Charlie is correct in saying it doesn't really do much.... what it does do, however, is provide a proof of concept: an 80-core network capable of over 1 TB/s of bandwidth.... this is done with a dedicated SRAM stack on top of the chip using through-silicon via technology.

In the end the 80-core chip doesn't do much, but what it does do is show that a massively parallel, tiled system can work and that the technology can be made to feed it data fast enough. This is the exciting part.

Imagine a chip with 16 tiles geared toward graphics, each tile capable of 8 shader operations per cycle; need more? Then bump that to 32 tiles; need even more? Bump it to 64 tiles.... you begin to see where this is going. Now, imagine a chip capable of putting say 64 tiles in parallel, each tile capable of calculating 8 ray traces to completion in say 10 cycles (just pulling crap out of the air as an example).... the concept of realtime ray-trace rendering is not too far fetched.

Jack


That's kind of an understatement. I had to read that article and check the author twice, as it seemed so "un-Charlie" to me. He actually stuck to facts, made no wild claims, quoted no "reliable" sources, and left no scathing commentary on Intel's ineptitude or misestimation of market direction. I would not be surprised to see a correction to that article saying something like "sorry, we credited the wrong writer, it was really Fuad".
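Just for fun, plugging in Jack's self-declared made-up figures (64 tiles, 8 rays per tile, 10 cycles per ray) shows what throughput they would imply. The 3.16 GHz clock is the figure reported for the teraflop demo; everything else here is Jack's hypothetical:

```python
# Back-of-the-envelope on Jack's hypothetical tile numbers.
# None of these are actual Intel specs except (reportedly) the clock.
tiles = 64
rays_per_tile = 8      # rays each tile finishes per "batch"
cycles_per_ray = 10    # cycles to complete one ray
clock_hz = 3.16e9      # demo chip reportedly ran ~3.16 GHz

rays_per_second = tiles * rays_per_tile * (clock_hz / cycles_per_ray)
print(rays_per_second)  # on the order of 1e11 rays/sec, toy math
```

Even with generous fudge factors that is a lot of rays per frame at 60 fps, which is why the realtime ray-tracing speculation in the quote isn't crazy on paper; feeding the tiles data fast enough is the real problem, as the article says.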
February 12, 2007 1:01:26 AM

Quote:
Keep in mind, these are very simple cores, not full x86 cores. Intel is focusing on how they are networked together. The fundamentals, I'm sure, will make their way into an x86 implementation, but Intel is saying not to look for this tech to be marketable for another 5-10 years.


So we won't be seeing one of these CPUs in our computers till at least 2012??

Damn! I am really power hungry and want all the power I can get!!
February 12, 2007 2:49:07 AM

I heard something about reverse hyperthreading a while back. A bunch of cores on a CPU would work like a single core if a program isn't built for multiple cores. Sounds like a good idea; that way you can harness the full potential of your proc in a non-multithreaded app. It could be put to a lot of good use in an 80-core CPU.
February 12, 2007 3:11:27 AM

Quote:
I heard something about reverse hyperthreading a while back. A bunch of cores on a CPU would work like a single core if a program isn't built for multiple cores. Sounds like a good idea; that way you can harness the full potential of your proc in a non-multithreaded app. It could be put to a lot of good use in an 80-core CPU.


That's a bunch of fantasy. It usually takes quite a lot of tedious explicit programming to take advantage of such parallel machines. Stream programming languages may help, but there is no magic that just splits up arbitrary programs and deploys them as parallel communicating threads of execution. :roll:
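To illustrate why the "magic" doesn't exist: even an embarrassingly parallel loop has to be explicitly partitioned and merged by the programmer before extra cores can do anything with it. A minimal Python sketch (function names are mine, not any real "reverse hyperthreading" API):

```python
# A sum, explicitly partitioned across workers.
# The point: someone has to split the work and merge the results;
# no hardware trick does this for an arbitrary single-threaded program.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk takes any remainder
    # Threads shown for simplicity; CPU-bound Python code would need
    # processes (or a GIL-free language) to actually run in parallel.
    with ThreadPoolExecutor(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```

Notice that the partitioning logic (chunk boundaries, remainder handling, the merge) is problem-specific; that's exactly the "tedious explicit programming" the post is talking about, and it's why automatically fusing 80 cores into one fast core for arbitrary code is fantasy.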
February 12, 2007 3:33:59 AM

If stacked tiles/cores are the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.
February 12, 2007 4:03:19 AM

Quote:
If stacked tiles/cores are the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.


Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I strongly believe, is putting together the interconnect fabric to feed the beast --- it was a proof of concept, not of a FLOP or a TeraFLOP, but simply of whether all the infrastructure required to feed such a monstrosity can be built.

Here Intel may have the upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard, but I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI [even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64... the list goes on. The Athlon XP was a champ for a while and AMD finally got to be top dog [albeit 9 months vs Intel's 30+ years] with the 939s, but Intel is back on top and they are going to stay there. AMD has nothing to top them. This new Intel core design could very well be the beginning of the end for AMD on many fronts, which WILL be sad. And unless the new gen of Radeons REALLY kicks ace, Nvidia is going to rule the GPU market.

UPDATE!! Just saw the X2800 pics and DAMN! I rest my case. ONLY the most diehard geeks and gamers are going to put that monster in their systems [which doesn't include me]... Way to go AMD/ATI [note sarcasm].
February 12, 2007 4:03:24 AM

Quote:
AMD is hard at work with the tiled/copy-paste methodology already.


I wasn't aware of this.

Question: wouldn't heat be a big concern with cores stacked on top of each other? Seems like they'd get awfully hot.
February 12, 2007 4:13:22 AM

Quote:
If stacked tiles/cores are the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.


Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I strongly believe, is putting together the interconnect fabric to feed the beast --- it was a proof of concept, not of a FLOP or a TeraFLOP, but simply of whether all the infrastructure required to feed such a monstrosity can be built.

Here Intel may have the upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard, but I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI [even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64... the list goes on.

Can you please provide me with a link that shows that AMD has taken ideas and screwed them up?
Where were you the last three years, when AMD had the performance crown and the best technology?
February 12, 2007 4:24:42 AM

Quote:
The difference between Fusion here and the 80-core project is that Intel's 80-core project is designed to test the extremes, not just in how many cores, but in how to connect them up to work together and shuttle a lot of data/info. We have not seen something like this from AMD yet, but it has to happen.... moving to a parallel universe in a heterogeneous manner will require some heavy lifting in the interconnect fabric to supply the necessary BW; otherwise a massively cored device will find many cores simply idling, which defeats the purpose.

Jack


AMD is going to a modular approach so that multiple chip profiles for various market segments can be created with the same architecture. What do you think they will use as the on-chip interconnect? HyperTransport? Perhaps something simpler, like Intel's approach, is better suited when the number of cores is large?
February 12, 2007 4:25:00 AM

Quote:
If stacked tiles/cores are the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.


Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I strongly believe, is putting together the interconnect fabric to feed the beast --- it was a proof of concept, not of a FLOP or a TeraFLOP, but simply of whether all the infrastructure required to feed such a monstrosity can be built.

Here Intel may have the upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard, but I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI [even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64... the list goes on.

Can you please provide me with a link that shows that AMD has taken ideas and screwed them up?
Where were you the last three years, when AMD had the performance crown and the best technology?

And I need to prove what to you? Where did you come up with "the last three years"? AMD was only on top with the Socket 939 procs, and then for only about 8 to 9 months. The P4s were still more powerful than the Athlon XPs and Socket 754 procs. Go huff and puff elsewhere...
February 12, 2007 4:30:20 AM

So you're saying that Intel has basically always been the better CPU manufacturer?? Interesting!!

Maybe I'll start buying Intel processors now, and I'm switching to Nvidia!
February 12, 2007 4:41:51 AM

Quote:
So you're saying that Intel has basically always been the better CPU manufacturer?? Interesting!!

Maybe I'll start buying Intel processors now, and I'm switching to Nvidia!


What are you, 8 years old? 9 maybe?
February 12, 2007 4:48:27 AM

Quote:
If stacked tiles/cores are the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.


Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I strongly believe, is putting together the interconnect fabric to feed the beast --- it was a proof of concept, not of a FLOP or a TeraFLOP, but simply of whether all the infrastructure required to feed such a monstrosity can be built.

Here Intel may have the upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard, but I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI [even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64... the list goes on.

Can you please provide me with a link that shows that AMD has taken ideas and screwed them up?
Where were you the last three years, when AMD had the performance crown and the best technology?

And I need to prove what to you? Where did you come up with "the last three years"? AMD was only on top with the Socket 939 procs, and then for only about 8 to 9 months. The P4s were still more powerful than the Athlon XPs and Socket 754 procs. Go huff and puff elsewhere...

I'm in agreement with heyhey here. AMD hasn't "screwed" anything up. Let's take a look at a few things AMD has pioneered:

EV6 bus. AMD licensed Alpha's EV6 bus and adapted it to the original Athlon line of processors. Why is this significant? The EV6 used DDR technology. Intel subsequently took the idea and created their own "quad-pumped bus" with the Pentium 4. A lot of people credit Rambus with inventing DDR, but it was really Alpha.

x86-64. No explanation necessary, really. AMD64 has proven itself. While Intel was trying to sell everyone on IA64, AMD was hard at work on a REAL solution for migrating from 32-bit to 64-bit. As I type this post I'm running Vista 64-bit and using x86 code for BitTorrent. If IA64 had its way my computer would be executing x86 code in "emulation mode." LOL. Fcuk that.

Dual Core. While Intel was slapping two dies together on a package and connecting them via the FSB (i.e., jury-rigging it), AMD had already implemented a native dual-core design in the Athlon64. The Athlon64 was designed with dual core in mind.

HyperTransport. This particular technology intertwines with dual core. AMD knew that two cores would need an extremely fast and easy method of communication, and thus HyperTransport was born.

On die memory controller. Another great idea that was executed perfectly. And it's also another idea that Intel is going to steal.

Now don't get me wrong, Intel has had just as many (if not more) great ideas. But to throw out a blanket statement like "They have always taken good ideas and just screwed them up"... well... that's a little silly. So far every single one of AMD's ideas since the K6 has been an enormous success.
February 12, 2007 4:54:23 AM

Quote:
UPDATE!!, Just saw the X2800 pics and DAMN! I rest my case. ONLY the most diehard geeks and gamers are going to put that monster in their systems[which doesn't include me]...Way to go AMD/ATI[note sarcasm].


Keep in mind that the RETAIL version of the X2800 will be 9.5 inches. That's 1 inch shorter than the 8800 GTX. The picture floating around the net is the 12-inch OEM version... which probably won't be sold through retail channels.

Now... one thing I will say is that the power draw for the X2800 is flat-out ridiculous. 270W?! The 8800 GTX is 180W max. But I expected nothing else from ATI. They've never been good at reducing power consumption or using efficient designs.
February 12, 2007 4:59:03 AM

Quote:
UPDATE!!, Just saw the X2800 pics and DAMN! I rest my case. ONLY the most diehard geeks and gamers are going to put that monster in their systems[which doesn't include me]...Way to go AMD/ATI[note sarcasm].


Keep in mind that the RETAIL version of the X2800 will be 9.5 inches. That's 1 inch shorter than the 8800 GTX.

The picture floating around the net is the 12-inch OEM version... which probably won't be sold through retail channels.

Good point. Still, they're both massive, produce massive heat and are power hungry... I don't care how great they are, both companies need to scale down their products.
February 12, 2007 5:00:57 AM

Agreed. 10.5 inches is a record length.

Actually... I take that back. Before 3DFX died they released a Voodoo5 that was like 10 inches in length. They only made a hundred or so. Let me see if I can find a pic.
February 12, 2007 5:07:48 AM




It was called the Voodoo 5 6000 and looks to be between 8 and 9 inches long. I can't find the dimensions anywhere.
February 12, 2007 5:09:41 AM

Quote:


I'm in agreement with heyhey here. AMD hasn't "screwed" anything up. Let's take a look at a few things AMD has pioneered:

EV6 bus. AMD licensed Alpha's EV6 bus and adapted it to the original Athlon line of processors. Why is this significant? The EV6 used DDR technology. Intel subsequently took the idea and created their own "quad-pumped bus" with the Pentium 4. A lot of people credit Rambus with inventing DDR, but it was really Alpha.

x86-64. No explanation necessary, really. AMD64 has proven itself. While Intel was trying to sell everyone on IA64, AMD was hard at work on a REAL solution for migrating from 32-bit to 64-bit. As I type this post I'm running Vista 64-bit and using x86 code for BitTorrent. If IA64 had its way my computer would be executing x86 code in "emulation mode." LOL. Fcuk that.

Dual Core. While Intel was slapping two dies together on a package and connecting them via the FSB (i.e., jury-rigging it), AMD had already implemented a native dual-core design in the Athlon64. The Athlon64 was designed with dual core in mind.

HyperTransport. This particular technology intertwines with dual core. AMD knew that two cores would need an extremely fast and easy method of communication, and thus HyperTransport was born.

On die memory controller. Another great idea that was executed perfectly. And it's also another idea that Intel is going to steal.

Now don't get me wrong, Intel has had just as many (if not more) great ideas. But to throw out a blanket statement like "They have always taken good ideas and just screwed them up"... well... that's a little silly. So far every single one of AMD's ideas since the K6 has been an enormous success.


Good points.

I agree, those are excellent points. AMD has had much success, and Intel has made some screw-ups of their own. However, AMD could have been on top a LONG time ago if they had properly implemented the MANY wonderful ideas that either went to waste or weren't brought to market right. I admire AMD for their spunk, but I just don't trust them.
February 12, 2007 5:12:16 AM

Quote:
It is all proof of concept, and difficult to really grasp how they made it all fit in order to work.

Jack
Yes, I need to do some reading and some serious thinking to make more sense of this. Thanks for your comments.
February 12, 2007 5:13:33 AM

Ya those things are monsters too! And I wouldn't have bought those either! LOL!! I draw a line between powerful and overkill...
February 12, 2007 5:24:11 AM

Quote:


I'm in agreement with heyhey here. AMD hasn't "screwed" anything up. Let's take a look at a few things AMD has pioneered:

EV6 bus. AMD licensed Alpha's EV6 bus and adapted it to the original Athlon line of processors. Why is this significant? The EV6 used DDR technology. Intel subsequently took the idea and created their own "quad-pumped bus" with the Pentium 4. A lot of people credit Rambus with inventing DDR, but it was really Alpha.

x86-64. No explanation necessary, really. AMD64 has proven itself. While Intel was trying to sell everyone on IA64, AMD was hard at work on a REAL solution for migrating from 32-bit to 64-bit. As I type this post I'm running Vista 64-bit and using x86 code for BitTorrent. If IA64 had its way my computer would be executing x86 code in "emulation mode." LOL. Fcuk that.

Dual Core. While Intel was slapping two dies together on a package and connecting them via the FSB (i.e., jury-rigging it), AMD had already implemented a native dual-core design in the Athlon64. The Athlon64 was designed with dual core in mind.

HyperTransport. This particular technology intertwines with dual core. AMD knew that two cores would need an extremely fast and easy method of communication, and thus HyperTransport was born.

On die memory controller. Another great idea that was executed perfectly. And it's also another idea that Intel is going to steal.

Now don't get me wrong, Intel has had just as many (if not more) great ideas. But to throw out a blanket statement like "They have always taken good ideas and just screwed them up"... well... that's a little silly. So far every single one of AMD's ideas since the K6 has been an enormous success.


Good points.

I agree, those are excellent points. AMD has had much success, and Intel has made some screw-ups of their own. However, AMD could have been on top a LONG time ago if they had properly implemented the MANY wonderful ideas that either went to waste or weren't brought to market right. I admire AMD for their spunk, but I just don't trust them.

I don't trust them either :)  ... but I don't trust Intel either... by this I mean that marketers will turn, twist, or hide data that does not reflect reality in many cases... thus, I trust my own ability to dig beyond the marketing hype. :) 

Example: the IMC, as mpjesse points out, while not novel in its own right, was very well executed by AMD in bringing it to the x86 platform. However, an IMC helps with memory BW/latency, thus decreasing the need for cache; when you really dig below the surface and check --- a large cache + slow bus can perform as well as or better than a small cache + fast bus.... it is basically trade-offs in the architectural decision-making process. We see this today --- Core 2 Duo certainly cannot sustain the memory BW that the AM2 can, yet C2D significantly outperforms it ... meaning the architecture does not depend on massive bandwidth.

I do appreciate your position though; AMD has a lot of spunk, they also have outstanding designers, and they will stay in the hunt, no doubt.

Jack

Oh, let's hope they stay in the game, unlike others who have gone the way of the dinosaurs [Cyrix...]. Personally I hope AMD keeps Intel on its toes and stays with us for a long time...
February 12, 2007 1:50:23 PM

Hasn't anyone on this forum heard the terms 'cell CPU' or 'software CPU'???? It looks to me like that is all Intel has developed here.... Which is great, but everyone needs to know they are just copying IBM here.... they have just put a bunch of cells on one die....
February 12, 2007 2:05:59 PM

Quote:
Hasn't anyone on this forum heard the terms 'cell CPU' or 'software CPU'???? It looks to me like that is all Intel has developed here.... Which is great, but everyone needs to know they are just copying IBM here.... they have just put a bunch of cells on one die....


This does look a lot more like a GPU than a CPU. I'll bet this thing would do very well with graphics. :D 
February 12, 2007 2:32:09 PM

Quote:
Hasn't anyone on this forum heard the terms 'cell CPU' or 'software CPU'???? It looks to me like that is all Intel has developed here.... Which is great, but everyone needs to know they are just copying IBM here.... they have just put a bunch of cells on one die....


This does look a lot more like a GPU than a CPU. I'll bet this thing would do very well with graphics. :D 

The idea is you can make each cell/core act any way you want.... you can make some act as CPUs, some as GPUs, some as math processors, etc.... So you could make 10 cores act like CPUs and 70 act like GPUs, gearing it for gfx.... Or you could make 70 cores act like CPUs and 10 like GPUs, gearing it for calculations....
February 13, 2007 12:37:40 AM

I wonder how that thing would handle Oblivion?
February 13, 2007 6:25:09 AM

Quote:
I wonder how that thing would handle Oblivion?


Probably about as well as a Prescott.