Elon Musk confirms that Grok 3 is coming soon — pretraining took 10X more compute power than Grok 2 on 100,000 Nvidia H100 GPUs

Four banks of xAI's HGX H100 server racks, holding eight servers each.
(Image credit: ServeTheHome)

Elon Musk has announced that pretraining of xAI's Grok 3 large language model (LLM) is complete, and that it took 10X more compute than Grok 2. He did not reveal many details, but based on the timing, Grok 3 was likely pretrained on xAI's Colossus supercluster, which contains some 100,000 Nvidia H100 GPUs.

"Grok 3 is coming soon," Elon Musk wrote in an X post. "Pretraining is now complete with 10X more compute than Grok 2."
