Micron's new ultra-fast memory to power Nvidia's next-gen AI GPUs — 24GB HBM3E chips put into production for H200 AI GPU

Micron's HBM3E memory chip.
(Image credit: Future)

Micron announced today that it is starting volume production of HBM3E, the company's latest memory for datacenter- and AI-class GPUs. In particular, Micron says its HBM3E will be used in Nvidia's upcoming H200, which is slated to launch in the second quarter of the year with six HBM3E memory chips. Micron also said it will detail its upcoming 36GB HBM3E chips in March at Nvidia's GTC conference.

Compared to regular HBM3, HBM3E boosts per-stack bandwidth from 1TB/s to up to 1.2TB/s, a modest but noticeable jump in performance. HBM3E also raises the maximum capacity per chip to 36GB, though Micron's HBM3E for Nvidia's H200 will be 'just' 24GB per chip, on par with HBM3's maximum. These 24GB chips use eight-high stacks rather than the twelve-high stacks HBM3E can theoretically support. Micron claims its particular HBM3E memory has 30% lower power consumption than the competing HBM3E implementations from Samsung and SK hynix.
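Taking the per-stack figures quoted above at face value, a quick back-of-the-envelope calculation shows what six HBM3E chips add up to. Note this is a theoretical upper bound assuming every stack hits its quoted peak, not Nvidia's published H200 specification:

```python
# Illustrative totals for six HBM3E stacks, using the per-stack figures
# quoted in the article: 24GB capacity and up to 1.2TB/s of bandwidth.
# These are theoretical aggregates, not official H200 product specs.
STACKS = 6
CAPACITY_PER_STACK_GB = 24
PEAK_BW_PER_STACK_TBPS = 1.2

total_capacity_gb = STACKS * CAPACITY_PER_STACK_GB        # 144 GB
peak_aggregate_bw_tbps = STACKS * PEAK_BW_PER_STACK_TBPS  # 7.2 TB/s

print(f"Total capacity: {total_capacity_gb} GB")
print(f"Theoretical aggregate bandwidth: {peak_aggregate_bw_tbps:.1f} TB/s")
```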

Matthew Connatser

Matthew Connatser is a freelance writer for Tom's Hardware US. He writes articles about CPUs, GPUs, SSDs, and computers in general.