Sharing an article from our friends over at Technology Review, who have posted a piece stating that Novel/Hewlett-Packard have improved on a chip manufacturing process that could extend Moore's Law beyond the dead-end of physical transistor shrinking, currently estimated to arrive in another decade or so.

Read the full article here.

This re-discovered process will effectively increase the number of transistors without shrinking the die by overlaying a wire mesh on top of the transistors and routing the interconnects vertically, rather than placing the transistors and interconnects on the same plane. 3D processors, anyone?!

The article also states that this new process will be ready for commercial manufacturing come 2010. Definitely a technology worth following.

Enjoy!
 

sailer

The article is dated January 16, 2007, so it's over a year old. That's not exactly new, though it may be new here. It is an interesting find, though. I imagine that cooling wouldn't be that big a problem. An insulating layer could be placed between the layers of transistors, with a conducting plate on the bottom layer to pipe away the heat. I don't know how it will be handled in the end, but I think HP will be studying it closely and getting it right before anything is released to the public.
 

d_kuhn

Conduction cooling is likely insufficient for anything other than ultra-low-power devices; the energy density is just too high.

Think of it this way... current processors dissipate around 75 watts each (some more, some less). Take that chip and make it roughly a cube, assuming each layer is maybe 100 microns thick (which is probably several times thicker than would be necessary). For a Core Duo, that would give you about 120 processor layers... or a 240-core supercomputer in a roughly 1 cm cube.

Pretty awesome potential there... IF you can keep it cool... because that 1 cm cube would be putting out 9 KILOWATTS of heat (roughly 31,000 BTU/hr), which is enough to heat 1,000 square feet of space in the winter.
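Here's a quick back-of-the-envelope sketch of that math; the 75 W per die, 100-micron layer thickness, and 1 cm stack height are the rough assumptions from above, not real device specs:

```python
# Back-of-the-envelope heat estimate for a hypothetical stacked ("cubed") processor.
# All inputs are the rough assumptions from the post above, not measured figures.

watts_per_chip = 75.0        # assumed dissipation of one dual-core die
layer_thickness_um = 100.0   # assumed thickness of one stacked layer, in microns
cube_height_mm = 12.0        # stack height for a roughly 1 cm cube

layers = int(cube_height_mm * 1000 / layer_thickness_um)  # ~120 layers
cores = layers * 2                                        # ~240 cores (dual-core layers)
total_watts = layers * watts_per_chip                     # ~9000 W = 9 kW
btu_per_hr = total_watts * 3.412                          # ~30,700 BTU/hr

print(f"{layers} layers, {cores} cores")
print(f"{total_watts / 1000:.1f} kW, about {btu_per_hr:,.0f} BTU/hr")
```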

Once we sort out how to make computers run with a LOT less waste heat (leakage current)... then this sort of technology will become mainstream. Until then, it'll be something that geeks at universities play with using cryogenic cooling.
 
I don't think the idea behind this article was to actually make a cube-shaped processor, and keeping a 1 cm cubed processor cool would be near impossible even with liquid cooling or liquid nitrogen. It would be like putting a fresh-baked turkey into the freezer. To wax poetic here, if they were to create nano-sized heatpipes to run through the cube mesh and somehow tie them into the heat spreader, then maybe...

But as stated, the article wasn't about creating cube-shaped procs so much as putting the processor interconnects and transistors onto two separate layers rather than on the same plane. And having two layers composing the processor rather than one wouldn't be any more difficult to cool than a processor made today.

I hope this is a technology that they continue to develop. Whether or not it makes it into desktop processors for us enthusiasts to take advantage of remains to be seen. I can't really imagine Intel buying a license from Novel/HP to integrate into their fabs; AMD perhaps, given their history with HyperTransport and industry partnerships, but I'd rather see Intel create its own flavor of 3D/mesh procs...