Elon Musk doubles down on goal of 50 million H100-equivalent GPUs in the next 5 years — Envisions billions of GPUs in the future as Grok 2.5 goes open source

(Image credit: Shutterstock)

Artificial intelligence has been Silicon Valley's obsession ever since ChatGPT ushered in a new big bang for the industry. The big players in this race are all gunning for ludicrously dense GPU clusters that can scale AI operations rapidly. One of them is Elon Musk, who owns X and its AI outfit xAI, and who seems determined to one-up Sam Altman and OpenAI, a company Musk, interestingly enough, co-founded. Last month, Musk said that xAI will bring the equivalent of 50 million H100 GPUs online within five years, and yesterday he reiterated that goal.

Yesterday's reply simply doubles down on that goal, but more interestingly, Musk teases AI compute at a scale no one has seriously planned for yet, saying, "Eventually, billions." That implies xAI will one day field compute equivalent to billions of H100 GPUs. Of course, this sounds somewhat disconnected from reality, considering the environmental risks AI already poses and the strain these massive data centers put on local communities. There are also thermal and electrical requirements to consider.

Meanwhile, Musk's rival, Sam Altman, has set his own target of bringing well over a million GPUs online by year-end, along with a vision of eventually reaching 100 million. That would cost roughly as much as the entire GDP of the UK. For comparison, xAI currently operates around 200,000 H200 GPUs, far short of the roughly 10 million H100-equivalents per year implied by Musk's 50-million-in-five-years target.
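For a rough sanity check on those numbers, here's a back-of-envelope sketch; the per-GPU price and UK GDP figures below are our own assumptions, not from Musk or Altman:

```python
# Back-of-envelope math for the GPU targets above.
# Assumed figures (not from the article): ~$30,000 per H100-class GPU,
# UK GDP of roughly $3.3 trillion.

GPU_PRICE_USD = 30_000        # assumed street price of an H100-class GPU
UK_GDP_USD = 3.3e12           # approximate UK GDP

altman_gpus = 100_000_000     # Altman's long-term 100 million GPU vision
musk_gpus = 50_000_000        # Musk's 50 million H100-equivalents
years = 5                     # Musk's stated timeline
xai_today = 200_000           # xAI's current fleet, per the article

# Total hardware cost of 100 million GPUs vs. UK GDP
print(f"100M GPUs at ~$30k each: ${altman_gpus * GPU_PRICE_USD / 1e12:.1f}T "
      f"(UK GDP: ~${UK_GDP_USD / 1e12:.1f}T)")

# Annual deployment rate implied by Musk's five-year goal
print(f"Implied rate: {musk_gpus // years:,} GPUs per year")

# Where xAI's current fleet stands against that yearly rate
print(f"xAI today vs. one year of that rate: "
      f"{xai_today / (musk_gpus / years):.0%}")
```

Under those assumptions, 100 million GPUs works out to about $3 trillion in hardware alone, in the same ballpark as UK GDP, while xAI's current fleet covers only about 2% of a single year of Musk's implied deployment rate.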

Meta is another competitor in this league, and its head honcho, Mark Zuckerberg, shares similar visions. He's building a "Hyperion" data center with a footprint nearly the size of Manhattan that will consume up to 5 GW of power, roughly New York City's base electrical load. In terms of actual compute, Zuckerberg has likewise promised to break the million-AI-GPU barrier by the end of this year, but, more importantly, the company is also developing homegrown AI chips that would reduce its reliance on Nvidia.

Zuckerberg's post on Threads (Image credit: Meta)

All of this comes as xAI's older model, Grok 2.5, has been made open source. X used Grok 2.5 before its shift to Grok 3, which Musk says will also go open source in about six months. The jury is still out on Grok 4, though, as the model has recently been engulfed in controversy and isn't exactly enjoying the best reputation right now. Sure, some bad actors would be aching to get their hands on this tech, but we can only hope that the hundreds of billions Mr. Musk plans to spend expanding his AI clusters can produce a bot that doesn't call itself MechaHitler.


Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • JRStern
    Both nutz.
    "Scale, scale, scale" has broken, nobody needs that much compute,
    New techniques will arrive that will obsolete current hardware one way or another.

    Meanwhile, freebie Grok has apparently been downgraded to version 3 operating in a new "thinking" mode, 50x slower than the old direct mode. Is it better? I can't really say. While still somewhat useful or amusing, it is well short of the freebie ChatGPT, which is version 5.

    I will probably try the paid version of ChatGPT pretty soon; I keep hitting the session limits on the freebie.