
Researchers Create 22nm Indium Gallium Arsenide Transistors

Source: MIT | 17 comments

Indium gallium arsenide is gaining traction as a potential successor to silicon in semiconductors.

Following a 20 nm transistor announcement from Purdue University, researchers at MIT's Microsystems Technology Laboratories said they successfully created a 22 nm indium gallium arsenide compound transistor.

As semiconductor manufacturing processes transition to smaller structures, researchers believe that silicon will eventually hit a limit beyond which it cannot be scaled. Indium gallium arsenide is considered a potential candidate to replace silicon at 10 nm and below. MIT said that the material is already used in fiber-optic communication and radar technologies, and is known for its excellent electrical properties. Recent successes in shrinking transistors made from the compound suggest that the industry is working toward a viable solution. The 10 nm mark is expected to be reached and surpassed in semiconductor manufacturing in the 2017/2018 time frame.

"We have shown that you can make extremely small indium gallium arsenide MOSFETs with excellent logic characteristics, which promises to take Moore's Law beyond the reach of silicon," said Jesús del Alamo, co-developer of the transistors.

The researchers said that many of the techniques used to make the indium gallium arsenide transistors are already in use in current silicon-based chip manufacturing. Even though these techniques have not previously been applied to compound semiconductors, del Alamo believes compound semiconductor fabrication will have to adopt them.

"When you are talking about integrating billions of tiny transistors onto a chip, then we need to completely reformulate the fabrication technology of compound semiconductor transistors to look much more like that of silicon transistors," del Alamo said.

He said that the team will now focus on improving the electrical performance of its transistors and on shrinking the gate length further, down to 10 nm.


 


Comments
  • killabanks, December 13, 2012 6:08 PM (+1)
    Just hope it's as cheap as silicon, or we might see a price bump.
  • madjimms, December 13, 2012 6:18 PM (+10)
    AMD better jump on board; that way we will ALL see some benefits. :-)
  • cbrunnem, December 13, 2012 6:26 PM (-2)
    Purdue wins!
  • deksman, December 13, 2012 7:53 PM (-4)
    Synthetic diamonds and carbon nanotubes could have been used in the late 1990s to go lower than what they are indicating (10 nm).
    Besides, graphene was stated to be ideal for going below 10 nm.
    The only reason we are seeing talk of this kind of reduction NOW is because it's cheaper than it was over a decade ago from a $$ point of view, even though we had the technology and resources to do it back then - it just wasn't 'cost friendly'.

    Money is slowing all of this down on the commercial end - it's disgusting.
  • clonazepam, December 13, 2012 7:55 PM (+1)
    Quote:
    the team will now be focusing on improving the electrical performance of their transistors...

    Does that mean its electrical performance isn't on par with silicon yet? Or, unspecified?
  • A Bad Day, December 13, 2012 7:55 PM (-1)
    Quote (clonazepam): Does that mean its electrical performance isn't on par with silicon yet? Or, unspecified?


    What's wrong with making the electrical performance better than silicon?
  • scotters, December 13, 2012 8:24 PM (-3)
    cool tech and stuff but... benchmarks or GTFO.
  • IndignantSkeptic, December 13, 2012 9:22 PM (+2)
    @deksman, what social system do you suggest for getting technology to advance at a faster rate? You say Capitalism is the problem, but why?

    Anyway, question for anyone, can this chip material be recycled from older chips to make newer chips?
  • pjmelect, December 13, 2012 9:34 PM (+1)
    As I understand it, indium gallium arsenide can be used to make much faster devices than silicon. I have always wondered why it was not used instead of silicon in making processors. The reason I have been told is that indium gallium arsenide is much more expensive to make.
  • A Bad Day, December 13, 2012 9:35 PM (+1)
    Quote (deksman): Synthetic diamonds and carbon nanotubes could have been used in the late 1990s to go lower than what they are indicating (10 nm). Besides, graphene was stated to be ideal for going below 10 nm. The only reason we are seeing talk of this kind of reduction NOW is because it's cheaper than it was over a decade ago from a $$ point of view, even though we had the technology and resources to do it back then - it just wasn't 'cost friendly'. Money is slowing all of this down on the commercial end - it's disgusting.


    If you can't convince investors that you can make a marketable product, then tough luck.

    There are some technologies that need decades of research before they're marketable, such as Li-ion batteries or LCDs.
  • A Bad Day, December 13, 2012 10:29 PM (+1)
    EDIT: Those two technologies were proposed in the 1950s or '60s.
  • Anonymous, December 13, 2012 10:58 PM (+1)
    I think a lot of the cost is about boule size. How big can they currently make InGaAs ones? Especially considering silicon will hopefully be moving to 400mm soon. Diamond would be great considering its excellent electron mobility, but again, just how big can they make the boules?
  • tiret, December 14, 2012 3:53 AM (-1)
    You know what gets me: how convenient it is that every year they reduce the size of the transistor by, say, 30%. There hasn't been a year, to my knowledge, where they hit a roadblock, and there hasn't been a year where they made a major improvement either.

    Moral of the story: marketing and planned release of superior tech, all to get the most out of us - the consumer.

    Maybe I'm wrong, but it all seems way too convenient.
  • ojas, December 14, 2012 10:27 AM (+1)
    Ah. That sweet sound of tick-tock-tick-tock... :D 
  • CaedenV, December 14, 2012 12:05 PM (+2)
    Quote (tiret): You know what gets me: how convenient it is that every year they reduce the size of the transistor by, say, 30%. There hasn't been a year, to my knowledge, where they hit a roadblock, and there hasn't been a year where they made a major improvement either. Moral of the story: marketing and planned release of superior tech, all to get the most out of us - the consumer. Maybe I'm wrong, but it all seems way too convenient.

    There are two things at work here:
    1) Revolutionary releases are bad for everyone. They take consumers by surprise, which pisses off those who just purchased a product and will now no longer have 'the best', and it makes people purchase defensively rather than when they want to purchase. On the business side it makes for 'feast and famine' markets rather than a steady income stream, which makes it a lot harder to budget resources on long-term projects. Slow, steady, and predictable releases are good for everyone, and Intel is king of that. We already know quite a bit about the next four generations of processors coming from Intel over the next 4-5 years. With AMD you simply never know until a month before, and even then you don't know what to expect from it until 'the next OS release fixes it'. Businesses and consumers would buy AMD if they simply knew what AMD was going to do ahead of time, even if they were not the fastest or best deal around; so long as they can plan their upgrade cycle around it, they would be happy.

    The exception to this rule is if you can come out with a revolutionary product every year, which is what Apple was doing with Jobs. But this is mostly a matter of marketing so the consumer feels good about each release, without actually getting something altogether better than the previous release. Then when a truly revolutionary release does come (like the last iPad), it pisses everyone off.

    2) It is a shift of focus from Intel. With the Pentium 4, Intel was focused on clock speed. They ditched the really great P6 architecture to move to NetBurst specifically to focus on clock speed, and they lost horribly to AMD, who showed that you can go much faster with better design than with raw horsepower. Because of this we saw Intel move from 180nm to 90nm over a six-year period ('00-'05), while the die size increased dramatically to some rather huge chips. Then (finally) in '06 Intel's brain turned on and they decided to focus on efficiency. They went back to the P6 core architecture (a glorified Pentium 3), bringing with it all of what they had learned over the previous six years, and they found that they could get a sub-2GHz CPU to beat their old 4GHz processors. So the focus was then on core efficiency, which is why we have gone from 65nm to 22nm over the last six years, and if you exclude the iGPU, the core CPU draws extremely little power compared to the old design.

    But just like the GHz wall that was hit before, we are now fast approaching the nm wall. There is already talk about delays of Broadwell (14nm) due to manufacturing problems, so a new paradigm needs to be found. Be it materials, or architecture design and extensions, or something else entirely, who knows. But do not mistake a company's obsession with a single focus (GHz or die shrinks) for the company being 'out to get you' or any such silliness. It is true that they are holding back just enough to keep a steady stream of buyers coming to their door, but they are also doing it because they do not have enough innovation to make something amazing every year. If they did, you can bet they would put it to market to fight off the ARM invasion that will be hitting desktops and laptops in the next year or two.
  • TeraMedia, December 14, 2012 2:56 PM (+2)
    GaAs-based IC technology has been around for decades. Because it switches faster than Si-based transistors, it was used in applications requiring ultra-high frequencies: fiber-optic communications and radar. And it has always been much more expensive than Si, for a variety of reasons - not the least of which is that its raw materials are highly toxic and not nearly as plentiful as SiO2. I don't know whether the power consumption characteristics of GaAs are similar to Si or not, but I suspect that's what they are trying to improve.

    Creating such small structures with electrically functional characteristics using InGaAs instead of Si is impressive.

    Personally, I'd prefer that research head in the carbon-molecule direction (graphene, nanotubes, diamond, whatever) rather than go with InGaAs (or even just GaAs). In spite of the little ghostbuster trash-can logo on most electronics packaging, most electronics still make their way to landfills, and I don't relish the idea of indium, gallium, and arsenic making their way into our water supplies.

    @deksman: Have you finished your homework assignment yet? As an alternative, you could always try something like crowdfunding.
  • foshizz, December 29, 2012 1:31 AM (0)
    Gosh, the possibilities in the future really make me believe that Back to the Future is on target with hoverboards by 2015!
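
A note on the node-cadence arithmetic raised by tiret and CaedenV in the thread above: a full process generation shrinks linear dimensions by roughly 0.7x, and because transistor area scales with the square of linear dimensions, that halves area (0.7^2 ≈ 0.5). So a steady ~30% linear reduction per generation is not suspicious regularity; it is the definition of one node step and the doubling cadence Moore's Law describes. A minimal Python sketch of that relationship, using the familiar marketing node names as illustrative values:

```python
# Sketch: why a ~30% (0.7x) linear shrink per process node roughly halves
# transistor area. Node names are the common marketing labels in nanometers;
# the figures are illustrative, not exact physical gate lengths.
nodes = [180, 130, 90, 65, 45, 32, 22, 14]

for prev, curr in zip(nodes, nodes[1:]):
    linear = curr / prev      # linear shrink factor between consecutive nodes
    area = linear ** 2        # area (and thus density) scales with the square
    print(f"{prev}nm -> {curr}nm: linear x{linear:.2f}, area x{area:.2f}")
```

Each step comes out near 0.7x linear and about 0.5x area, which matches both the 180nm-to-90nm and 65nm-to-22nm progressions CaedenV cites: roughly one node step at a time, each one doubling density.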