OpenAI
Latest about OpenAI

Elon Musk plans to scale the xAI supercomputer to a million GPUs — currently at over 100,000 H100 GPUs and counting
By Anton Shilov
xAI plans to scale the Colossus supercomputer to one million GPUs, which could make it the most powerful machine in the world.

OpenAI execs mused over Cerebras acquisition in 2017
By Anton Shilov
To reduce its reliance on Nvidia, OpenAI considered acquiring Cerebras.

Musk's concerns over Google DeepMind 'AI Dictatorship' revealed in emails from 2016
By Jowi Morales
Elon Musk's lawsuit against OpenAI has revealed a lot of juicy emails detailing the thoughts of the nonprofit's leadership.

Chinese company trained GPT-4 rival with just 2,000 GPUs
By Anton Shilov
According to the founder, 01.ai has trained its high-performing AI model with just $3 million.

AWS CEO estimates future AI model training will consume power on the scale of a large city
By Jowi Morales
AWS CEO Matt Garman estimates that future LLM training could require up to five gigawatts, and AWS is investing in renewable energy sources to ensure power is available when needed.

ChatGPT-5 won't be coming this year
By Aaron Klotz
OpenAI CEO Sam Altman confirmed that ChatGPT-5 won't be coming this year due to development and resource limitations; instead, it will arrive early next year.

Chinese researchers build military AI using Meta’s open-source Llama model
By Jowi Morales
Chinese researchers built ChatBIT, an AI focused on intelligence gathering and processing, using Meta's Llama 13B model.

Meta is using more than 100,000 Nvidia H100 AI GPUs to train Llama-4
By Jowi Morales
Mark Zuckerberg says that Meta is training its Llama-4 models on a cluster with over 100,000 Nvidia H100 AI GPUs.