'Everyone and Their Dog is Buying GPUs,' Musk Says as AI Startup Details Emerge

image of brain over circuit board
(Image credit: Shutterstock)

Elon Musk has confirmed that his companies Tesla and Twitter were buying large numbers of GPUs when asked whether he was building up Twitter's compute prowess to develop a generative artificial intelligence project. Meanwhile, the Financial Times reports that Musk's AI venture will be a separate entity from his other companies, but that it could use Twitter content for training.

Elon Musk's AI project, which he began exploring earlier this year, is reportedly separate from his other companies, but it could potentially use Twitter content as data to train its language model and tap into Tesla's computing resources, according to the Financial Times. This somewhat contradicts an earlier report, which claimed that the AI project would be part of Twitter. 

To build up the new project, Musk is recruiting engineers from top AI companies, including DeepMind, and has already brought on Igor Babuschkin from DeepMind along with approximately half a dozen other AI specialists.

Musk is also reportedly negotiating with various SpaceX and Tesla investors about the possibility of funding his latest AI endeavor, according to an individual with firsthand knowledge of the talks, which lends weight to the idea that the project is not set to be a part of Twitter.

In a recent Twitter Spaces interview, Musk was asked about a report claiming that Twitter had procured approximately 10,000 Nvidia compute GPUs. Musk acknowledged this, stating that everyone, including Tesla and Twitter, is buying GPUs for compute and AI these days. Indeed, both Microsoft and Oracle have acquired tens of thousands of Nvidia's A100 and H100 GPUs in recent quarters for their AI and cloud services. 

"It seems like everyone and their dog is buying GPUs at this point," Musk said. "Twitter and Tesla are certainly buying GPUs."

Nvidia's latest H100 GPUs for AI and high-performance computing (HPC) are quite expensive. CDW sells Nvidia's H100 PCIe card with 80GB of HBM2e memory for as much as $30,603 per unit. On eBay, these cards sell for over $40,000 per unit for buyers who want them quickly. 

Nvidia recently launched its even more powerful H100 NVL product, which bridges two H100 PCIe cards, each with 94GB of HBM3 memory, into a dual-GPU 188GB solution designed specifically for training large language models. This product will certainly cost well above $30,000 per unit, though it is unclear at what price Nvidia sells such units to customers buying tens of thousands of boards for their LLM projects.

Meanwhile, the exact position of the AI team in Musk's corporate empire remains unclear. The entrepreneur established a company called X.AI on March 9, the Financial Times reported, citing business records from Nevada. He also recently changed Twitter's name in the company's records to X Corp., which may be part of his plan to build an 'everything app' under the 'X' brand. Musk is currently the sole director of X.AI, while Jared Birchall, who manages Musk's wealth, is listed as its secretary. 

The rapid progress of ChatGPT from OpenAI, which Elon Musk co-founded in 2015 but is no longer involved with, reportedly inspired him to explore the idea of a rival company. This new AI venture is expected to be a separate entity from his other companies, possibly to ensure that the project will not be constrained by Tesla's or Twitter's corporate structures.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • lmcnabney
    If everybody and their dog were buying GPUs, then both AMD and Nvidia wouldn't be slashing production orders with TSMC.

    Elon is just distracted by the new toy because his last new toy (Twitter) isn't very much fun anymore.
    Reply
  • hotaru251
    lmcnabney said:
    If everybody and their dog were buying GPUs, then both AMD and Nvidia wouldn't be slashing production orders with TSMC.
    Elon likely didn't mean the consumer-type GPUs you'd game on (that's peanuts to Nvidia overall).
    AI-focused & data center ones are selling.
    Datacenter/auto are still high.

    Reply
  • sam buddy
    Admin said:
    "Meanwhile, the Financial Times reports (opens in new tab) that Musk's AI venture will be a separate entity from his other companies, but it could use Twitter content for training."
    Now, THAT would be fun! Can't wait.
    Reply
  • lmcnabney
    hotaru251 said:
    Elon likely didn't mean the consumer-type GPUs you'd game on (that's peanuts to Nvidia overall).
    AI-focused & data center ones are selling.
    Datacenter/auto are still high.

    This may not be obvious, but a business can make more money selling less product. Keeping prices high means really fat margins. Would you rather make $300/GPU on average and sell a million or make $100/GPU on average and sell two million? The executives will happily sacrifice volume for margin.
    Reply
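    The margin-versus-volume tradeoff in the comment above can be sketched in a few lines of Python. Note that the $300/$100 margins and the 1M/2M unit counts are the commenter's hypothetical figures, not real Nvidia numbers:

    ```python
    # Margin-vs-volume sketch using the hypothetical figures from the comment above.
    def profit(margin_per_gpu: float, units_sold: int) -> float:
        """Total profit at a given per-unit margin and sales volume."""
        return margin_per_gpu * units_sold

    high_margin = profit(300, 1_000_000)   # fewer units, fatter margin
    high_volume = profit(100, 2_000_000)   # more units, thinner margin

    print(f"High-margin strategy: ${high_margin:,.0f}")   # $300,000,000
    print(f"High-volume strategy: ${high_volume:,.0f}")   # $200,000,000
    ```

    With these illustrative numbers, the high-margin strategy nets $300M against $200M for the high-volume one, which is the commenter's point: cutting volume can still raise total profit if per-unit margin rises enough.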
  • UWguy
    Elon the grifter. Trying to use his influence to artificially raise the prices of GPUs. Pretty sad attempt.
    Reply
  • bit_user
    lmcnabney said:
    This may not be obvious, but a business can make more money selling less product. Keeping prices high means really fat margins. Would you rather make $300/GPU on average and sell a million or make $100/GPU on average and sell two million? The executives will happily sacrifice volume for margin.
    It all depends on how price-sensitive demand is. When you talk about something like oil, demand is relatively inelastic. A lot of individuals and businesses simply need gasoline, jet fuel, heating oil, etc. so OPEC can get away with cutting production by a little bit and watch prices skyrocket.

    For AI, we're talking about businesses which hope to turn compute power into $ (not unlike crypto), so you have a RoI associated with spending on compute. If Nvidia's compute is too expensive, people will either look elsewhere or just put their AI plans on hold.

    The way it works out is that it's in Nvidia's best interest to produce as many GPUs as needed to keep up with demand, especially while they're offering the market-leading option. Artificially limiting supply might not boost prices enough to offset the drop in volume, and will ultimately benefit alternatives, like AMD, Intel, Cerebras, Graphcore, Tenstorrent, and dozens of other companies trying to play in this space.
    Reply
  • bit_user
    UWguy said:
    Elon the grifter. Trying to use his influence to artificially raise the prices of GPUs. Pretty sad attempt.
    Huh? What does he stand to gain by doing that?
    For me, what's surprising is that he's not using Tesla's Dojo supercomputer. I wonder if that's because it's not as good at Transformer networks, or just because they can't build it up fast enough to accommodate the additional demand.
    Reply
  • domih
    Indeed, the NVIDIA H100 80GB HBM2e is simply unobtainium for individual developers and SOHO.

    On NewEgg you can get one for $42,000 (https://www.newegg.com/p/1VK-0066-00022).

    On eBay the cheapest is $41,500.

    Stratospheric pricing in the sky over Fantasy Land.

    At least it suggests that graphics GPUs for us low-lifes are way too cheap compared to compute GPUs :)

    Stop complaining!

    /s
    Reply
  • hotaru251
    UWguy said:
    to artificially raise the prices of GPU
    Jensen already doing this w/o Elon.

    bit_user said:
    what's surprising is that he's not using Tesla's Dojo supercomputer.
    It was built for FP16 data sets.

    My limited info on that stuff is that it's good to use after you have already developed something, but not best when you want to make something new (as it trades accuracy for speed), whereas Nvidia's GPUs are less specialized and have benefits for his current needs.
    Reply
  • bit_user
    domih said:
    Indeed, the NVIDIA H100 80GB HBM2e is simply unobtainium for individual developers and SOHO.

    On NewEgg you can get one for $42,000 (https://www.newegg.com/p/1VK-0066-00022).

    On eBay the cheapest is $41,500.

    Stratospheric pricing in the sky over Fantasy Land.
    In the last thread on Elon's big GPU purchase, I looked at how much Dell wanted for adding it to their standard 2U server platform (PowerEdge R750xa):
    "They want an absolutely astounding $86,250 per H100 PCIe card, and they make you add a minimum of 2 GPUs to the chassis!!!"Update: today, I see they've dropped the price to only $54,329.13 each! That puts the minimum configured price at an unthinkable $121,242.99, and they have the absolute gall to claim that's after $71,235.83 in savings!

    Having a decent amount of experience with Dell servers at my job, I know they like big markups for add-ons, but I'm still pretty stunned by that one.

    The takeaway is: never complain about price-gouging, until you see the Dell price!
    Reply