Nvidia reportedly races to secure memory supply for next-gen H200 AI GPUs — pre-purchases up to $1.54 billion in HBM3E memory

(Image credit: SK Hynix)

If you sell more processors for artificial intelligence (AI) and high-performance computing (HPC) applications than anyone else in the industry and want to keep it that way, you need to ensure a steady supply of your products. This is exactly what Nvidia does: it not only pre-purchases TSMC's wafer and packaging capacity, but it also secures its supply of HBM3E memory. Korean publication Chosun Biz reports that the company has pre-purchased as much as $1.54 billion worth of HBM3E memory from Micron and SK Hynix.

According to the publication's conversations with industry insiders, Nvidia has made upfront payments of 700 billion to 1 trillion Korean won (roughly $540 million to $770 million) to each of Micron and SK Hynix, which would put the combined total between $1.08 billion and $1.54 billion. While there is no specific information about what the payments are designated for, it is widely speculated within the industry that they are meant to secure a steady supply of HBM3E memory for Nvidia's upcoming 2024 AI and HPC GPU releases.
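For readers who want to sanity-check those figures, here is a minimal sketch of the currency arithmetic behind the quoted ranges. The ~1,300 won-per-dollar exchange rate and the assumption that each of the two suppliers received a payment in the reported range are ours, not something the report spells out.

```python
# Back-of-the-envelope check of the reported figures. The exchange rate
# and the per-supplier split below are assumptions, not reported facts.

KRW_PER_USD = 1_300  # assumed rough late-2023 KRW/USD exchange rate

def won_to_usd_billions(won: float) -> float:
    """Convert Korean won to billions of US dollars at the assumed rate."""
    return won / KRW_PER_USD / 1e9

per_supplier_low = 700e9   # 700 billion won (reported lower bound)
per_supplier_high = 1e12   # 1 trillion won (reported upper bound)
suppliers = 2              # Micron and SK Hynix

print(f"Per supplier: ${won_to_usd_billions(per_supplier_low):.2f}B "
      f"to ${won_to_usd_billions(per_supplier_high):.2f}B")
print(f"Combined:     ${won_to_usd_billions(per_supplier_low * suppliers):.2f}B "
      f"to ${won_to_usd_billions(per_supplier_high * suppliers):.2f}B")
```

Run as written, the sketch prints a per-supplier range of $0.54B to $0.77B and a combined range of $1.08B to $1.54B, matching the figures above.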

Nvidia is ramping up two products that use HBM3E memory: the H200 AI and HPC GPU with 141GB of HBM3E, and the GH200 platform, which pairs a Grace CPU with an H200 GPU carrying the same 141GB of HBM3E. Both devices are expected to be in high demand and will require a tremendous amount of memory, so buying in advance makes a lot of sense for Nvidia.

In fact, it is not uncommon for GPU makers to pre-purchase expensive memory products from their suppliers, as it is easier to sell advanced GPUs bundled with memory, particularly to smaller graphics card makers. In the case of AI and HPC GPUs, Nvidia tends to sell finished PCIe cards and SXM modules rather than bare GPU dies, so it makes sense for the company to procure the HBM3E itself, too.

With its AI and HPC GPUs sold out for quarters to come, Nvidia needs to ensure a steady supply of memory for the H100, H200, GH200, and other products that use HBM3 or HBM3E.

It remains to be seen whether Micron, SK Hynix, and Samsung will have enough capacity left to supply HBM3 and HBM3E memory to other developers of AI and HPC solutions, such as AMD, AWS, and Intel. If they do not, Nvidia will be able to further strengthen its position in the growing AI hardware market in 2024.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.