IBM Proposes Carbon Nanotubes Instead of Silicon for Chips
IBM researchers believe they have found a way to overcome the physical limits of shrinking silicon in future computer chips. The company suggests that carbon nanotubes are key to smaller transistors, as the material may be able to replace silicon at some point.
According to the company, it was able to produce "10,000 working transistors made of nano-sized tubes of carbon" and place them "precisely" on a single chip using "standard semiconductor processes". The placement density was one billion carbon nanotubes per square centimeter. Of course, 10,000 transistors are a far cry from the more than 1 billion transistors found on today's CPUs. The placement precision of 99.8 percent may look close to the 99.999 percent required for a 1-billion-transistor chip, but those remaining 0.199 percentage points are harder to achieve than the first 99.8 percent were.
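To see why that last fraction of a percent matters, here is a back-of-the-envelope sketch (the transistor count and precision figures are the illustrative numbers from the article; the assumption that misplacements scale linearly with device count is a simplification):

```python
def expected_misplaced(total_transistors: int, precision_pct: float) -> float:
    """Expected number of misplaced devices on a chip with
    `total_transistors` devices at a given placement precision (%)."""
    return total_transistors * (1 - precision_pct / 100.0)

chip = 1_000_000_000  # roughly the transistor count of a modern CPU

# At IBM's demonstrated 99.8% precision:
print(round(expected_misplaced(chip, 99.8)))    # ~2,000,000 misplaced devices

# At the ~99.999% precision the article says would be required:
print(round(expected_misplaced(chip, 99.999)))  # ~10,000 misplaced devices
```

In other words, 99.8 percent precision still leaves on the order of two million misplaced nanotubes on a billion-transistor chip, which is why the gap to 99.999 percent is the hard part.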
Nevertheless, IBM's announcement is remarkable, and the company states that there is reason to believe carbon nanotube transistors are likely to "replace and outperform silicon technology". In its research, IBM said it positioned the carbon nanotube transistors by creating a circuit pattern on a substrate using regions of chemically modified hafnium oxide (HfO2) on an otherwise silicon oxide (SiO2) surface. The carbon nanotubes then attached themselves to the HfO2 via a "chemical bond", IBM said.
Carbon nanotube transistors are nowhere near commercial production, but IBM clearly sees the technology as a viable approach to building transistors that are "a few tens of atoms across". The company said the material itself is also more attractive than silicon because electrons move more easily in carbon nanotube transistors than in silicon-based devices, which would result in faster chips.
http://www.sciencedaily.com/releases/2012/02/120219191244.htm
Miniaturization has taken us quite far since the early '80s, but it's time to start looking toward other innovations to improve computing power. At some point, once nano-scale structures have bottomed out in size, we will need to go back to clock speed and IPC to improve performance on a classical (i.e. non-quantum) computer. Maybe photonics holds the key here, but it's getting to be time for Intel to pull something truly revolutionary out from behind the R&D curtain.
Kudos to IBM. You may be a dinosaur, but you've still got the R&D chops; last I heard they were working on a 1 Tbps optical interconnect for CPU bandwidth.
http://www.extremetech.com/computing/121587-ibm-creates-cheap-standard-cmos-1tbps-holey-optochip
Let the good times keep rollin
Below 5 nm you start getting funny behavior beyond plain old electron leakage: effects like quantum tunneling.
Unfortunately, IBM doesn't quite fall into the Intel/Boeing/Lockheed camp. Those companies bet the company each time they design a new jumbo jet, construct a new chip fab, or build a prototype warplane. And even Intel doesn't risk as much as the others. If they did, or IBM did, we might see commercial nanotube transistor-based ICs sooner.
They have already 'started' creating problems, which is why Intel moved to the '3D transistor' to help with leakage issues. On a hard structural level I believe the limit is 4-5 nm, but for reliable signal flow through a transistor it is theorized that we can only get down to somewhere near 8-10 nm using current materials.
Remember, the smaller you go, the easier it is for electrons to slip through, because there is less material to slow them down. At the same time, the smaller you go, the easier it is to melt things with heat, because the structures are so much more fragile. So you need substantially less power to keep things from burning up, but that leaves an ever-shrinking signal-to-noise ratio, which is why we will likely never make it all the way down to the hard limit of 5 nm.
This is old news.
Silicon is woefully inefficient as are numerous other materials used in computers (and throughout industry at large).
Switching to synthetic materials that can be made in abundance (and that are, incidentally, superior) would have solved a lot of our 'problems', but capitalism prefers to milk outdated/old tech for as long as possible for the sake of profits.
I loathe the technological obscurity the system forces onto everyone.
Not to mention things like solar flares affecting them... didn't Toyota have some issue with this?
There is a world of difference between seeing and patenting a usage, and actually bringing it into production. This article is about a production improvement.
Bringing carbon nanotubes into play back in 1992 would have created the first prototypes then, which could have been hybridized with silicon to increase efficiency across the board (and let's not forget using them in lithium-ion batteries, laptop/desktop casings, displays, etc.).
Synthetic diamonds could have followed suit in 1996.
Oh, and commercial companies like Intel use silicon and other inefficient materials in electronics because they are 'cheap' in dollar terms. They DON'T create electronics with the BEST possible results the material is capable of at the time; instead they create much less efficient electronics/tools that they can revise every 12 to 24 months, because that brings long-term profits.
We live in technological obscurity and have resource shortages because of this profit nonsense.
If profits weren't the goal, we would have switched to superior synthetic materials a long time ago and created the BEST of what is possible/efficient and in line with our latest scientific knowledge; the children's toys we have today would have been skipped, and we'd be more advanced in turn.