Grok
Latest about Grok

Elon Musk's xAI raises $6 billion to build more powerful AI supercomputers
By Anton Shilov
xAI, which is set to build a supercomputer with 200,000 Nvidia GPUs, raises $6 billion.

Elon Musk plans to scale the xAI supercomputer to a million GPUs — currently at over 100,000 H100 GPUs and counting
By Anton Shilov
xAI plans to scale the Colossus supercomputer to one million processors, which could make it the most powerful machine in the world.

Elon Musk's xAI reportedly shifts $6 billion AI server order from troubled Supermicro to its rivals
By Anton Shilov
Dell, Inventec, and Wistron land new orders from xAI as Supermicro faces significant financial challenges.

Elon Musk's massive AI data center gets unlocked
By Jowi Morales
The Tennessee Valley Authority approved xAI's request for 150MW to power its AI supercomputer used for training Grok.

First in-depth look at Elon Musk's 100,000 GPU AI cluster
By Sunny Grimm
Now, witness the firepower of this fully armed and operational AI supercluster

Elon Musk set up 100,000 Nvidia H200 GPUs in 19 days - Jensen says the process normally takes 4 years
By Aaron Klotz
Elon Musk and the team behind xAI purportedly set up a total of 100,000 Nvidia H200 GPUs in just 19 days, a feat that should have taken four years to complete.

Elon Musk powers new "World's Fastest AI Data Center" with gargantuan portable power generators to sidestep electricity supply constraints
By Jowi Morales
Elon Musk deployed 14 mobile generators at the xAI Memphis Supercluster to generate 35 MWe to power 32,000 H100 GPUs.

Elon Musk reveals photos of Dojo D1 Supercomputer cluster — roughly equivalent to 8,000 Nvidia H100 GPUs for AI training
By Jowi Morales
Elon Musk says that he'll have 90,000 Nvidia H100s, 40,000 AI4 chips, and the equivalent of 8,000 H100 GPUs in Dojo D1 processors by the end of 2024.