As chip design grows costlier and development cycles lengthen, chip designers like AMD are turning to AI to optimize their spending and speed up time-to-market. By now, over 200 chip designs placed-and-routed using the Synopsys DSO.ai electronic design automation (EDA) software suite have been taped out, and the number is growing rapidly.
“By the end of 2022, adoption, including 9 of the top 10 semiconductor vendors have moved forward at great speed with 100 AI-driven commercial tape-outs,” said Aart J. de Geus, chief executive of Synopsys, at the most recent earnings call (via Yahoo! Finance). “Today, the tally is well over 200 and continues to increase at a very fast clip as the industry broadly adopts AI for design from Synopsys.”
The growing complexity of chips requires designers to adopt the latest nodes to make them viable, which is why development and production costs are skyrocketing. A moderately complex chip manufactured using a 7nm process technology came with a development price tag of approximately $300 million, with nearly 40% of this cost attributed to software. By contrast, the development cost of an advanced 5nm chip exceeds $540 million, including software costs, based on data from International Business Strategies (IBS). Moving forward, the development cost of a sophisticated 3nm GPU is projected to be around $1.5 billion, with software costs accounting for about 40% of this price tag.
When you spend $1.5 billion on a chip, there is no room for mistakes, and human beings, unlike AI, are prone to making them, which is why it makes great sense to use artificial intelligence for highly complex designs. In fact, Synopsys announced a full stack of AI-assisted design tools earlier this year.
“We unveiled the industry's first full stack AI-driven EDA suite, Synopsys.ai,” said de Geus. “Specifically, in parallel to second-generation advances in DSO.ai we announced VSO.ai, which stands for verification space optimization; and TSO.ai, test space optimization. In addition, we are extending AI across the design stack to include analog design and manufacturing.”
Virtually all large chipmakers are now adopting AI-assisted EDA tools, though not everyone is ready to confirm this.
“Partners in the announcement included Nvidia, TSMC, MediaTek, Renesas and IBM Research, all providing stunning use cases of the rapid progress and criticality of Synopsys.ai to deliver their breakthrough results,” added de Geus.
In short, AI serves as a tool in every step rather than completely taking over any step.
Especially when it's not true AI... it only spits out things it has been fed.
Even if AI were to make another AI, the result would still be flawed.
Every chip has functional blocks or subsystems. Placing these blocks by hand involved guesswork: which arrangement would have the least signal propagation time, or the best path for high-current passageways. APU here, look-ahead buffer there. Decoder here, cache there, out-of-order engine there, security there, memory controller there...
There are tons of these logic blocks. They used to be placed by hand with some intuition based on experience and some rough numbers. AI does a better job at placement.
Sort of like how programmers tend to use compilers, instead of hand-coding assembly language. Not a perfect analogy, but the point is that nobody is taking compiler output as a "first pass" and then tweaking it by hand.
The resulting design already goes through a validation process when humans do the placement, so the same validation would simply be applied to the AI-driven process.
Human fallibility is no virtue, in engineering or manufacturing.