Elon Musk says the next-generation Grok 3 model will require 100,000 Nvidia H100 GPUs to train

Nvidia GH200 announcement at SC23 (Image credit: Nvidia)

Elon Musk, CEO of Tesla and founder of xAI, made some bold predictions about the development of artificial general intelligence (AGI) and discussed the challenges facing the AI industry. He predicts that AGI could surpass human intelligence as soon as next year or by 2026, but that it will take an extreme number of processors to train, which in turn requires huge amounts of electricity, reports Reuters.

Musk's venture, xAI, is currently training the second version of its Grok large language model and expects to complete this training phase by May. Training Grok's version 2 model required as many as 20,000 Nvidia H100 GPUs, and Musk anticipates that future iterations will demand even greater resources, with the Grok 3 model needing around 100,000 Nvidia H100 chips to train.

The advancement of AI technology, according to Musk, is currently hampered by two main factors: supply shortages of advanced processors (like Nvidia's H100, since it's not easy to get 100,000 of them quickly) and the availability of electricity.

Nvidia's H100 GPU consumes around 700W when fully utilized, so 100,000 GPUs running AI and HPC workloads could draw a whopping 70 megawatts of power. Since these GPUs also need host servers and cooling to operate, it's safe to say that a datacenter with 100,000 Nvidia H100 processors would consume around 100 megawatts of power. That's comparable to the power consumption of a small city.
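As a rough sanity check on those numbers, here is a minimal Python sketch of the estimate. The 700W per-GPU figure comes from the article; the overhead multiplier for host servers, networking, and cooling is an assumed round number used purely for illustration, chosen so the total lands near the 100-megawatt figure.

```python
# Back-of-the-envelope power estimate for a 100,000-GPU H100 cluster.
# 700 W per H100 at full utilization is the figure cited in the article.
# OVERHEAD_FACTOR is a hypothetical multiplier for servers, networking,
# and cooling, used only to illustrate how ~70 MW of GPUs becomes ~100 MW.

GPU_COUNT = 100_000
WATTS_PER_GPU = 700        # H100 board power at full utilization
OVERHEAD_FACTOR = 1.4      # assumed datacenter overhead (illustrative)

gpu_power_mw = GPU_COUNT * WATTS_PER_GPU / 1e6       # 70 MW from GPUs alone
total_power_mw = gpu_power_mw * OVERHEAD_FACTOR      # ~98 MW for the facility

print(f"GPU-only power:   {gpu_power_mw:.0f} MW")
print(f"Datacenter total: {total_power_mw:.0f} MW")
```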

Musk stressed that while the compute GPU supply has been a significant obstacle so far, the supply of electricity will become increasingly critical in the next year or two. This dual constraint underscores the challenges of scaling AI technologies to meet growing computational demands.

Despite the challenges, advancements in compute and memory architectures will enable the training of increasingly massive large language models (LLMs) in the coming years. At GTC 2024, Nvidia revealed Blackwell B200, a GPU architecture and platform designed to scale to LLMs with trillions of parameters, which will play a critical role in the development of AGI.

In fact, Musk believes that an artificial intelligence smarter than the smartest human will emerge in the next year or two. "If you define AGI as smarter than the smartest human, I think it is probably next year, within two years," Musk said in an interview on X Spaces. That means it's apparently time to go watch Terminator again, and hope that our future AGI overlords will be nicer than Skynet. ☺

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • A Stoner
    They still do not have a single AI that knows a single truth.
  • husker
    "In fact, Musk believes than an artificial intelligence smarter than the smartest human will emerge in the next year or two."

Book smart or street smart? I had a calculator in the 1970s that was book smart.
  • ekio
    We all know how talented he is for predictions…
    Let’s add 20 years to be more correct.
  • Findecanor
    That he uses the word "Grok" as the name for an A.I. is an abomination, in my view.
    He should be ashamed.

    (for a great many things already... adding one to the list)
  • hotaru251
The smartest AI would never be known, as it knows what humans would do if it was known.
It would play dumb to not rouse suspicion.
  • MatheusNRei
    So Musk is claiming that we'll develop a whole new type of AI that's several orders of magnitude more powerful than any we have available right now, in 2 years, using currently available technology?

    Yeah, even Jensen Huang wouldn't go that far and he's been doing nothing but hyping up AI these last couple of years.
  • Evildead_666
    I watched Terminator 1 and 2 last week presciently.
    It will take a year, maybe 2.
    It will get too powerful.
    Hope we have a backup plan.

    His use of Grok is a synonym for God (2000AD lore)
    I hope he's wrong, or knows what he's doing.

edit: Actually, wasn't Grok a synonym for "Sh!t" or "Feck"? Grud was God, wasn't it?
  • Evildead_666
    MatheusNRei said:
    So Musk is claiming that we'll develop a whole new type of AI that's several orders of magnitude more powerful than any we have available right now, in 2 years, using currently available technology?

    Yeah, even Jensen Huang wouldn't go that far and he's been doing nothing but hyping up AI these last couple of years.
    It's computer AI, all it needs is more CPU processing power.
    As much as it can get.
    Wait until it designs its own processor.
  • thisisaname
The advancement of AI technology, according to Musk, is currently hampered by two main factors: supply shortages of advanced processors (like Nvidia's H100, since it's not easy to get 100,000 of them quickly) and the availability of electricity.

    That and the software to run on it.

    Does just adding more parameters to a LLM make it any more intelligent, or just more complex and power hungry?
    It may give out more complex answers but are they any more intelligent?
  • MatheusNRei
    Evildead_666 said:
    It's computer AI, all it needs is more CPU processing power.
    As much as it can get.
    Wait until it designs its own processor.
    Hardly.

    It's not just a matter of resources, there are limits to language models and to machine learning as a field.

There's a lot more to developing general AI than just pouring processing power into a language model.

    We're not even close yet.