AI Chip-Layout Tool Has Helped Design Over 100 Chips

Synopsys DSO.ai chip PPA tool
(Image credit: Synopsys)

Synopsys' AI solution for semiconductor design has reached the milestone of 100 commercial chip tape-outs. The company said in a press release that customers such as STMicroelectronics and SK hynix have enjoyed up to 3x productivity increases, seeing chips with up to 25% lower total power and significant reductions in die size thanks to Synopsys DSO.ai (Design Space Optimization AI).

To be clear, AI isn't stealing chip designers' jobs: Synopsys prefers to say that its software frees human chip designers and hardware engineers from iterative work so that, with AI augmentation, they can focus on innovation instead.

“With reduced design and verification cycles and effort, design teams can spend more of their time innovating on their core ideas,” Synopsys said. The company hopes AI can at least partially relieve the engineering talent shortage.

The milestone achieved by Synopsys shows that AI use in electronic design automation is rapidly becoming mainstream. Moreover, this AI can be particularly useful to industrial sectors increasingly looking to enter the chip design business, such as auto makers. Synopsys even goes so far as to describe DSO.ai as an “expert engineer in a box.”

So, what exactly does Synopsys DSO.ai do? The biggest clue is in the name: Design Space Optimization by AI. The tool takes care of the floor-planning for a new chip (or iterations of it). Synopsys says DSO.ai is a great fit for the trend toward multi-die silicon designs, which would otherwise involve a high volume of repetitive planning tasks for humans.

To complete its task, the AI software optimizes power, performance, and area (PPA) across any given chip design space. Improving PPA is a proven route to doing more with fewer resources, and it has been an especially popular optimization target in recent years, as cryptocurrency demand and the pandemic created shortages of key materials.
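To make the idea of "design space optimization" concrete, here is a deliberately simplified sketch. It is not Synopsys' actual algorithm (DSO.ai uses reinforcement learning over real EDA flows); the cost function, the knobs (`utilization`, `drive_strength`), and the random-search strategy below are all hypothetical stand-ins chosen purely to illustrate how a tool can search a design space for a good PPA trade-off.

```python
import random

# Toy "design space": each candidate configuration is a pair of knobs.
# Hypothetical illustration only -- not Synopsys' method.

def ppa_cost(utilization, drive_strength):
    """Lower is better: a made-up blend of power, performance, and area."""
    power = 0.5 * drive_strength ** 2             # stronger cells burn more power
    delay = 1.0 / (drive_strength * utilization)  # weak/sparse cells are slower
    area = 1.0 / utilization                      # low utilization wastes die area
    return power + delay + area

def random_search(trials=500, seed=42):
    """Sample the design space and keep the best-scoring configuration."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        u = rng.uniform(0.4, 0.95)   # placement utilization
        d = rng.uniform(0.5, 3.0)    # average cell drive strength
        cost = ppa_cost(u, d)
        if best is None or cost < best[0]:
            best = (cost, u, d)
    return best

best_cost, best_u, best_d = random_search()
print(f"best PPA cost {best_cost:.3f} at utilization {best_u:.2f}, drive {best_d:.2f}")
```

Even this naive search beats a fixed default configuration, which hints at why a learning-based tool that explores far larger, real design spaces can claw back power and area that a single hand-tuned run would leave on the table.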


Synopsys customers have been reaping the benefits of DSO.ai, with impressive results claimed. Synopsys asserts that its customers have seen productivity boosts of more than 3x, power reductions of up to 15%, substantial die-size reductions in finished designs, and lower resource use. It also suggests that an ideal task for AI is facilitating multi-foundry strategies to mitigate supply chain vulnerabilities and lower costs.

Synopsys is already looking at broadening the use of AI in other chip design and verification workflows. It seems we may be seeing a breakthrough moment in AI-based chip design, and it's interesting to see this news as consumer-facing AIs from Google, Microsoft, and OpenAI are also making headlines.

Mark Tyson
Freelance News Writer

Mark Tyson is a Freelance News Writer at Tom's Hardware US. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.

  • bit_user
    seeing chips with up to 25% lower total power and significant reduction in die size thanks to Synopsys DSO.ai
    Sounds like almost a node's worth of improvement, just due to smarter planning & layout. I'm sure it'll have its detractors, but it seems to me pretty much like a pure win.

    It's not as if we didn't have automated layout tools for decades. It's just that conventional routing & placement algorithms weren't comparable to expert human layout engineers. It seems that now, the tables have finally turned.
    Reply
  • InvalidError
    bit_user said:
    It's not as if we didn't have automated layout tools for decades. It's just that conventional routing & placement algorithms weren't comparable to expert human layout engineers.
    Even experts cannot realistically manage billions of transistors in an optimal manner like EDA tools can... and even when you successfully design a theoretically optimal silicon function block by hand, that design won't be optimal once integrated in something else where adjustments need to be made to input/output orientation, timing margins for transit, etc.
    Reply
  • Amdlova
Remember Bulldozer :) It had some AI behind it...
    https://megagames.com/news/ex-amd-engineer-reveals-why-bulldozer-design-inferior-intel%E2%80%99s-sandy-bridge
    Reply
  • bit_user
    Amdlova said:
Remember Bulldozer :) It had some AI behind it...
    https://megagames.com/news/ex-amd-engineer-reveals-why-bulldozer-design-inferior-intel’s-sandy-bridge
    I'd heard that, but couldn't find a source when I was trying to look it up some years ago. So, thanks for the link!

    But no, that wasn't AI. That was the inferior, conventional ASIC layout & routing tools I was talking about. Traditionally, if you wanted a top-performing CPU, you'd do a "full custom" layout, like the article says. AMD didn't do that with Bulldozer, also as discussed in the article.

    See? It's a hard problem we couldn't really crack until deep learning came along.
    Reply
  • btmedic04
    there are literally 6 terminator movies as to why this is a frightening idea :eek::ROFLMAO:
    Reply
  • bit_user
    btmedic04 said:
    there are literally 6 terminator movies as to why this is a frightening idea :eek::ROFLMAO:
Automated chip layout tools strike me as one of the less scary applications of AI. It's not as if the design that comes out isn't subject to intense validation - it is. The last thing any chip maker wants to do is blow vast sums of money on a production run of defective chips. Even worse is if they ship to customers and have to do an expensive RMA.

    You don't need general artificial intelligence to do something like this, much like honeybees don't need general intelligence to build their hives.
    Reply
  • InvalidError
    bit_user said:
    You don't need general artificial intelligence to do something like this, much like honeybees don't need general intelligence to build their hives.
Though Skynet would put it in there given a chance - engineer backdoors into stuff at the hardware level so it could propagate itself.
    Reply
  • bit_user
    InvalidError said:
Though Skynet would put it in there given a chance - engineer backdoors into stuff at the hardware level so it could propagate itself.
By the time a general-purpose AI gets smart enough to hack our EDA software and engineer backdoors into the chips it produces without us even noticing, I think we'll already have bigger problems to worry about!

    There was a TV series, a couple years ago, that painted a scenario where a rogue AI hacked Alexa and surveillance networks, and simply used social engineering to make people do its bidding. It's a little bit sensationalist, but it pretty effectively shows how much damage an AI (or hackers, or a hostile government actor) can do by exploiting our existing systems. If that sounds interesting, then you'll probably find it worth watching.

    https://www.imdb.com/title/tt9315054/?ref_=fn_al_tt_9
    Reply
  • Geef
    We all know Mark Tyson is an 👨‍🎤 AI Cyborg. The reason his pic shows him bald is because it was taken shortly after he had the AI chip implanted in his brain! 🙆‍♂️
    Reply
  • TerryLaze
    Al is not A.I.
https://www.youtube.com/watch?v=uq-gYOrU8bA
    Reply