OpenAI and Broadcom to co-develop 10GW of custom AI chips in yet another blockbuster AI partnership — deployments start in 2026
The AI firm’s latest hardware deal locks in another 10 gigawatts of capacity as it moves to design its own accelerators.

OpenAI has signed a multi-year deal with Broadcom to co-develop and deploy 10 gigawatts of custom AI accelerators and rack systems, the companies announced on October 13. OpenAI will handle accelerator and system design, while Broadcom leads development and roll-out starting in the second half of 2026. Full deployment is targeted by the end of 2029.
The agreement forms part of OpenAI’s ongoing, aggressive hardware push. In contrast to its current reliance on Nvidia GPUs, the new systems will be built around in-house accelerators paired with Broadcom’s networking and hardware IP. The deal could mark a shift away from traditional GPU-centric clusters toward tightly integrated silicon tailored to OpenAI’s training and inference workloads.
The two companies have already been working together for over 18 months, and this formal agreement builds on that collaboration. Few technical details have been disclosed, but the joint announcement confirms that the systems will use Ethernet-based networking, suggesting a data-center architecture designed for scalability and vendor neutrality. OpenAI says deployments will be phased over several years, with the first racks going online in the second half of 2026.
The new agreement adds to OpenAI’s existing partnerships with Nvidia and AMD, bringing the company’s total hardware commitments to an estimated 26 gigawatts: roughly 10 gigawatts of Nvidia infrastructure, about 6 gigawatts of AMD’s upcoming Instinct MI series, and now 10 gigawatts of Broadcom co-developed accelerators.
Interestingly, OpenAI is not believed to be Broadcom’s still-unknown $10 billion customer. Speaking with CNBC alongside OpenAI’s Greg Brockman, Broadcom semiconductor solutions president Charlie Kawwas joked, “I would love to take a $10 billion [purchase order] from my good friend Greg,” adding, “He has not given me that PO yet.” The Wall Street Journal reports that the deal is worth “multiple billions of dollars.”
OpenAI stands to gain a deep bench in ASIC design and proven supply-chain maturity from Broadcom, which already produces custom AI silicon for hyperscale customers, including Google’s TPUs. By leveraging Broadcom’s Ethernet and chiplet IP, OpenAI gets a path to differentiated hardware without building a silicon team from scratch.
Meanwhile, for Nvidia, the deal adds to a growing list of partial defections among major AI customers exploring in-house silicon. Amazon, Google, Meta, and Microsoft are all now pursuing custom accelerators. What remains to be seen is how well these bespoke solutions perform at scale, and whether vendors like Broadcom can match the ecosystem maturity of CUDA.
Neither company has disclosed foundry partners, packaging flows, or memory choices for the upcoming accelerators. Those decisions will shape delivery timelines just as much as wafer capacity. With deployment still a year out, the clock is ticking.

Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.
-
JRStern
So we still don't know if this is or is not the same deal we heard about a month ago, but a month ago it was measured in gigadollars, and now this one is measured in gigawatts. I'm gonna assume it is the same one, just a PR victory, until someone tells me otherwise.
-
HubersPaul15
JRStern said: "So we still don't know if this is or is not the same deal we heard about a month ago, but a month ago it was measured in gigadollars, and now this one is measured in gigawatts. I'm gonna assume it is the same one, just a PR victory, until someone tells me otherwise."
Broadcom's chip division president came on CNBC for an interview and said that OpenAI is not the mystery $10B chip customer from the last earnings report.
-
JRStern
HubersPaul15 said: "Broadcom's chip division president came on CNBC for an interview and said that OpenAI is not the mystery $10B chip customer from the last earnings report."
OK, thanks, and I guess it is in the article after all, I should read more carefully. I'm still left with a lot of free-floating skepticism, but ok.