ARM Pitches Tri-gate Transistors for 20nm and Beyond

According to a blog post by Jean-Luc Pelloie, ARM's Fellow and Director of SOI Technology, 20 nm may represent an inflection point at which it becomes necessary to transition from the planar metal-oxide-semiconductor field-effect transistor (MOSFET) to fin field-effect transistors (FinFETs), also known as 3D transistors. Intel calls its implementation a tri-gate design and is set to debut it with the company's 22 nm Ivy Bridge product generation.

Pelloie explains that, as transistors shrink, it is "increasingly difficult to control the vertical electric field between the gate and the substrate while maintaining the channel depletion below threshold and then minimize the leakage current." FinFETs are considered a solution to this problem, and the engineer wrote that they can be built on either bulk or SOI wafers. ARM still has work to do, though, particularly in investigating how well the technology scales. "However, 3D devices are clearly on the road for sub-20nm nodes … and FinFET’s time may finally be here," Pelloie wrote.
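To put the leakage problem Pelloie describes in more concrete terms, a simplified textbook relation (not taken from his post) captures it: below threshold, drain current falls off exponentially with gate voltage, and the steepness of that fall-off depends on how tightly the gate, rather than the drain, controls the channel. Here $I_D$ is drain current, $V_{GS}$ the gate-source voltage, $V_{th}$ the threshold voltage, $V_T = kT/q \approx 26$ mV at room temperature, and $n \geq 1$ a body factor that worsens as gate control weakens:

$$
I_D \propto e^{\frac{V_{GS} - V_{th}}{n V_T}}, \qquad S = n\, V_T \ln 10 \approx 60\,n \ \text{mV/decade}
$$

The subthreshold swing $S$ is the gate voltage needed to change the off-state current by a factor of ten. As planar transistors shrink, the drain field increasingly competes with the gate for control of the channel, pushing $n$ and $S$ upward and raising leakage; wrapping the gate around a thin fin, as FinFET and tri-gate designs do, restores electrostatic control and keeps $n$ close to its ideal value of 1.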

The FinFET's history goes back to a December 2000 paper published in IEEE Transactions on Electron Devices entitled "FinFET - A Self-Aligned Double-Gate MOSFET Scalable to 20 nm". Intel was the first to take tri-gates into mass production, but it was accused of switching to 3D transistors only because it could not scale its SRAM test chips, which the company traditionally uses to unveil a new manufacturing process, from 32 nm to 22 nm. Intel announced its tri-gate technology back in May of this year. The company hopes that tri-gate transistors will keep it on pace with Moore's Law and, in particular, enable low-voltage and low-power operation.

    Top Comments
  • If we could look 200 years into the future I don't think we would even understand what we were looking at concerning computers.
    21
  • Don't you just wish you could take a peek at the computer news in say the year 2212?

    We are all quick to dismiss this CPU and that storage device as too slow, too expensive and too whatever. But we have come a long way from the ZX-81 and C64 in a relatively short period of time.

    So, given a 200 year time span, what will we be dealing with, and still dismissing as too slow, expensive and what have you.

    I wish time travel was more affordable so I could have a look :-)
    11
  • Other Comments
  • Sooner or later, silicon transistor technology will reach its limit, slow down and stop. It might happen in the next 50 years or less.

    This doesn't mean we won't produce chips any more, but that their quality will stay more or less the same: certainly not half the size and twice as fast every 3 years.

    Mass production of chips based on a completely different technology is still science fiction today.
    -6