AI Tools Take Chip Design Industry by Storm: 200+ Chips Tape Out
More chip designers adopt AI to develop their chips.
As chip design grows costlier and development cycles lengthen, chip designers like AMD are turning to AI to optimize their spending and speed up time-to-market. By now, over 200 chip designs placed-and-routed using the Synopsys DSO.ai electronic design automation (EDA) software suite have been taped out, and the number is growing rapidly.
“By the end of 2022, adoption, including 9 of the top 10 semiconductor vendors have moved forward at great speed with 100 AI-driven commercial tape-outs,” said Aart J. de Geus, chief executive of Synopsys, at the most recent earnings call (via Yahoo! Finance). “Today, the tally is well over 200 and continues to increase at a very fast clip as the industry broadly adopts AI for design from Synopsys.”
The growing complexity of chips requires designers to adopt the latest nodes to make them viable, which is why development and production costs are skyrocketing. A moderately complex chip manufactured using a 7nm process technology came with a development price tag of approximately $300 million, with nearly 40% of this cost attributed to software. By contrast, the development cost of an advanced 5nm chip exceeds $540 million, including software costs, based on data from International Business Strategies (IBS). Moving forward, the development cost of a sophisticated 3nm GPU is projected to be around $1.5 billion, with software costs accounting for about 40% of this price tag.
When you spend $1.5 billion on a chip, there is no room for mistakes, and human beings, unlike AI, are prone to making them, which is why it makes great sense to use artificial intelligence for highly complex designs. In fact, Synopsys announced a full stack of AI-assisted design tools earlier this year.
“We unveiled the industry's first full stack AI-driven EDA suite, Synopsys.ai,” said de Geus. “Specifically, in parallel to second-generation advances in DSO.ai we announced VSO.ai, which stands for verification space optimization; and TSO.ai, test space optimization. In addition, we are extending AI across the design stack to include analog design and manufacturing.”
Virtually all large chipmakers are now adopting AI-assisted EDA tools, though not everyone is ready to confirm this.
“Partners in the announcement included Nvidia, TSMC, MediaTek, Renesas and IBM Research, all providing stunning use cases of the rapid progress and criticality of Synopsys.ai to deliver their breakthrough results,” added de Geus.
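For readers wondering what an AI-driven place-and-route tool actually optimizes: such tools search an enormous space of block placements against objectives like wirelength, timing, and power. The sketch below illustrates the flavor of the problem with simulated annealing, a classic (pre-AI) placement heuristic. The block names, net list, and algorithm choice are purely illustrative assumptions for this example and have nothing to do with Synopsys's actual implementation.

```python
import math
import random

# Toy floorplan: named blocks on a 4x4 grid of sites, nets connecting pairs
# of blocks. Cost = total Manhattan wirelength over all nets. All names and
# numbers here are made up for illustration.
BLOCKS = ["cpu", "cache", "mem_ctrl", "decoder", "security", "io"]
NETS = [("cpu", "cache"), ("cpu", "decoder"), ("cache", "mem_ctrl"),
        ("decoder", "security"), ("mem_ctrl", "io")]
GRID = 4

def wirelength(placement):
    """Sum of Manhattan distances between endpoints of every net."""
    total = 0
    for a, b in NETS:
        (ax, ay), (bx, by) = placement[a], placement[b]
        total += abs(ax - bx) + abs(ay - by)
    return total

def anneal(seed=0, steps=20000, t0=5.0, cooling=0.9995):
    rng = random.Random(seed)
    # Random initial placement: each block gets a distinct grid site.
    sites = rng.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                       len(BLOCKS))
    placement = dict(zip(BLOCKS, sites))
    cost, temp = wirelength(placement), t0
    for _ in range(steps):
        a, b = rng.sample(BLOCKS, 2)  # propose: swap two blocks
        placement[a], placement[b] = placement[b], placement[a]
        new_cost = wirelength(placement)
        # Always accept improvements; accept regressions with probability
        # exp(-delta/T), so early moves can escape local minima.
        if new_cost > cost and rng.random() >= math.exp((cost - new_cost) / temp):
            placement[a], placement[b] = placement[b], placement[a]  # undo
        else:
            cost = new_cost
        temp *= cooling
    return placement, cost

placement, cost = anneal()
print(f"final wirelength: {cost}")
```

A real tool optimizes millions of cells against many simultaneous objectives; reinforcement-learning approaches like DSO.ai replace hand-tuned heuristics of this sort with learned search policies.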
Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.
NinoPino: I understand that AI can help, but can someone explain which stage of the design flow is improved, and in what way?
tennis2 (quoting the article): "no place for mistake and human beings, unlike AI, are prone to make mistakes"
Bold statement.
JTWrenn
NinoPino said: I understand that AI can help, but can someone explain which stage of the design flow is improved, and in what way?
Probably something that is present in every stage for some part or another. Likely used as double checks or optimizations for layouts, or maybe even as a debugger. Used to recommend things when people get stumped on some part of a project. Used to create rough drafts that are then tweaked by humans. Used to do something complex for a human to apply but that has been done before, like reorganizing a group of data the way it was done before, which couldn't be done with a normal algorithm but AI can mimic.
In short: as a tool in every step, rather than to completely take over any step.
tamalero
tennis2 said: Bold statement.
Agreed. I always laugh when they claim that "AI" does not make mistakes. Especially when it's not true AI... it only spits out things it has been fed.
Phaaze88: It does not make mistakes, when those who made it are flawed by design? That imperfection is what makes humans perfect.
Even if AI were to make another AI, it's still flawed.
digitalgriffin
Admin said: Synopsys says over 200 chips have been developed using its DSO.ai place-and-route EDA solution.
When the machines start designing themselves, mankind will begin to fall.
digitalgriffin
JTWrenn said: In short: as a tool in every step, rather than to completely take over any step.
Every chip has functional blocks or subsystems. Placing these by hand involved educated guesses: which arrangement would have the least signal propagation time, or the best path for high-current passageways. APU here, look-ahead buffer there; decoder here, cache there, out-of-order execution there, security there, memory controller there...
There are tons of these logic blocks. They used to be placed by hand with some intuition based on experience and some rough numbers. AI does a better job at placement.
JTWrenn
digitalgriffin said: They used to be placed by hand with some intuition based on experience and some rough numbers. AI does a better job at placement.
Yup, as I said: "Likely used as double checks or optimizations for layouts, or maybe even as a debugger."
bit_user
JTWrenn said: In short: as a tool in every step, rather than to completely take over any step.
I think that's not accurate. It's probably used exclusively for placement and routing. Sort of like how programmers tend to use compilers instead of hand-coding assembly language. Not a perfect analogy, but the point is that nobody takes compiler output as a "first pass" and then tweaks it by hand.
tamalero said: I always laugh when they claim that "AI" does not make mistakes.
The resulting design is already subjected to a validation process when humans do it, so they would simply do the same with an AI-driven process.
Phaaze88 said: That imperfection is what makes humans perfect.
Human fallibility is no virtue, in engineering or manufacturing.
Phaaze88
bit_user said: Human fallibility is no virtue, in engineering or manufacturing.
Yes? I just don't believe in perfection.