Nvidia buys AI chip startup Groq's assets for $20 billion in the company's biggest deal ever — Transaction includes acquihires of key Groq employees, including CEO
GroqCloud will continue operations as is.
Nvidia, the largest GPU manufacturer in the world and the linchpin of the AI data center buildout, has entered into a non-exclusive licensing agreement with AI chip rival Groq to use the company's intellectual property. The deal is valued at $20 billion and includes the acquihire of key Groq employees, who will now join Nvidia. Nvidia's previous record was its roughly $7 billion purchase of Israeli networking company Mellanox in 2019, a figure this deal comfortably eclipses.
Groq is an American AI startup developing Language Processing Units (LPUs) that it positions as significantly more efficient and cost-effective than standard GPUs. Groq's LPUs are application-specific integrated circuits (ASICs), a category drawing growing interest from many firms because their custom designs are better suited to certain AI tasks, such as large-scale inference. Groq argues that it excels at inference, having previously described it as a high-volume, low-margin market.
Nvidia is the largest beneficiary of the AI boom, supplying most of the world's AI data centers and holding deals with essentially every major AI player. Groq has previously accused Nvidia of predatory exclusivity tactics, claiming that potential customers feared losing their Nvidia inventory allocations if they were found talking to competitors such as Groq. Those concerns appear to have been laid to rest with this deal.
“We plan to integrate Groq’s low-latency processors into the NVIDIA AI factory architecture, extending the platform to serve an even broader range of AI inference and real-time workloads... While we are adding talented employees to our ranks and licensing Groq’s IP, we are not acquiring Groq as a company.” — Jensen Huang, Nvidia CEO (as per CNBC).
Earlier this year, Groq built its first data center in Europe to counter Nvidia's AI dominance, shaping up as an underdog story challenging a behemoth on cost-at-scale. Now, Groq's own LPUs will be deployed in Nvidia's AI factories, as the license covers "inference technology," according to SiliconANGLE.
As part of this transaction, Groq founder and CEO Jonathan Ross and president Sunny Madra will be hired by Nvidia, along with other employees. Ross previously worked at Google, where he helped develop the Tensor Processing Unit (TPU). Simon Edwards, Groq's current finance chief, will step up as the new CEO under this refreshed structure.
Hiring away a company's core talent like this is referred to as an acquihire. Among tech firms, it's a common way to sidestep antitrust scrutiny while gaining access to a company's assets and IP. Meta's AI hiring sprees also fall under this category, along with Nvidia's recent recruitment of Enfabrica's CEO.
The announcement characterizes the deal as a non-exclusive agreement, meaning Groq will remain an independent entity, and GroqCloud, the platform through which the company rents out access to its LPUs, will continue to operate as before. Before this deal, Groq was valued at $6.9 billion in September of this year and was on pace to report $500 million in fiscal revenue.

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.
jp7189: I've never quite understood Groq's magic sauce, but they use on-board SRAM measured in MB rather than stacks of expensive HBM measured in GB, and still knock it out of the park for LLM-specific tasks. I'm kinda sad to see Nvidia gobble them up, because they seemed to have something truly different, and I'm actually a fan of competition.