Buyers of Nvidia's highest-end H100 AI GPU are reportedly reselling them as supply issues ease

Nvidia GH200 SC23 Announcement
(Image credit: Nvidia)

Evidence mounts that lead times for Nvidia's H100 GPUs, which are commonly used in artificial intelligence (AI) and high-performance computing (HPC) applications, have shrunk significantly, from 8-11 months to just 3-4 months. As a result, some companies that had bought ample quantities of H100 80GB processors are now trying to offload them. It has also become much easier to rent H100-based compute from major cloud providers such as Amazon Web Services, Google Cloud, and Microsoft Azure. Meanwhile, companies developing their own large language models (LLMs) still face supply challenges.

The Information reports that some companies are reselling their H100 GPUs or reducing orders due to their decreased scarcity and the high cost of maintaining unused inventory. This marks a significant shift from the previous year when obtaining Nvidia's Hopper GPUs was a major challenge. Despite improved chip availability and significantly decreased lead times, the demand for AI chips continues to outstrip supply, particularly for those training their own LLMs, such as OpenAI, according to The Information.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • ivan_vy
    looks like the 7 trillion would not be required after all
  • Li Ken-un
    Admin said:
    Some AI startups offload Nvidia H100 amid supply situation ease
    Definitely not to eBay I presume. 🤔 And supposing it were to end up on eBay, probably nigh unaffordable for us peons.
  • renz496
    New era of scalping.