Nvidia reportedly races to secure memory supply for next-gen H200 AI GPUs — pre-purchases $1.54 billion in HBM3E memory

(Image credit: SK Hynix)

If you sell more processors for artificial intelligence (AI) and high-performance computing (HPC) applications than anyone else in the industry and want to keep it that way, you need to ensure a steady supply of your products. That is exactly what Nvidia is doing: it not only pre-purchases TSMC's wafer and packaging capacity, but it also secures its supply of HBM3 memory. Korean publication Chosun Biz reports that the company has pre-purchased over $1.3 billion worth of HBM3 memory from Micron and SK Hynix.

According to the publication's conversations with industry insiders, Nvidia has made upfront payments ranging from 700 billion to 1 trillion Korean won (approximately $540 million to $770 million) to each of Micron and SK Hynix. There is no specific information about what the payments, which could total between $1.08 billion and $1.54 billion, are designated for, but it is widely speculated within the industry that they are meant to secure a steady supply of HBM3E memory for Nvidia's upcoming 2024 AI and HPC GPU releases.

Given that Nvidia's AI and HPC GPUs are sold out for quarters to come, the company needs to ensure a steady supply of memory for its H100, H200, GH200, and other products that use HBM3 or HBM3E.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.