H100
Latest about H100

Indian firms secretly funneled AMD, Nvidia AI GPUs to Russia — sanctions reportedly skirted on hundreds of millions of dollars of hardware
By Anton Shilov
India has reportedly become the second-largest supplier of restricted technology to Russia, with Indian companies shipping AMD's Instinct MI300X and Nvidia's H100 processors to the country.

Distributor claims that Nvidia has stopped taking orders for HGX H20 GPUs
By Anton Shilov
Some dealers in China have reportedly stopped taking orders for Nvidia's HGX H20 processor.

Intel launches Gaudi 3 accelerator for AI: Slower than Nvidia's H100 AI GPU, but also cheaper
By Anton Shilov
Intel formally introduces Gaudi 3 AI accelerators, claiming massive price and TCO advantages over Nvidia's H100.

Nvidia publishes first Blackwell B200 MLPerf results: Up to 4X faster than its H100 predecessor when using FP4
By Anton Shilov
There are quite a few caveats and qualifications to that figure.

Elon Musk shows off Cortex AI supercluster
By Dallin Grimm
Another of Musk’s new supercomputers makes headway.

Faulty Nvidia H100 GPUs and HBM3 memory caused half of the failures during Llama 3 training — one failure every three hours for Meta's 16,384 GPU training cluster
By Anton Shilov
In a 16,384 H100 GPU cluster, something broke down roughly every three hours during training. In most cases, H100 GPUs or their HBM3 memory were to blame, according to Meta.

Elon Musk reveals photos of Dojo D1 Supercomputer cluster — roughly equivalent to 8,000 Nvidia H100 GPUs for AI training
By Jowi Morales
Elon Musk says that he'll have 90,000 Nvidia H100s, 40,000 AI4 chips, and the equivalent of 8,000 H100 GPUs in Dojo D1 processors by the end of 2024.

Elon Musk fires up ‘the most powerful AI cluster in the world’ to create the 'world's most powerful AI' by December
By Mark Tyson
Memphis Supercluster training started at ~4:20am local time.