Micron starts to ship samples of HBM4 memory to clients — 36 GB capacity and bandwidth of 2 TB/s

Micron's HBM4
(Image credit: Micron)

Micron has started shipping samples of its next-generation HBM4 memory to key customers, the company announced this week. The new memory stacks, designed for upcoming AI and HPC processors, feature a 36 GB capacity and offer bandwidth of 2 TB/s per stack.

Micron's first HBM4 samples are 12-Hi devices packing 36 GB of memory and featuring a 2048-bit interface with a data transfer rate of around 7.85 GT/s. The samples combine 24 Gb DRAM dies made on the company's 1β (1-beta) DRAM process technology with logic base dies produced by TSMC on its 12FFC+ (12nm-class) or N5 (5nm-class) logic process technology.

Micron's current-generation HBM3E memory also offers capacities of up to 36 GB, but it features a 1024-bit interface and a data transfer rate of up to 9.2 GT/s, providing peak bandwidth of up to 1.2 TB/s per stack. The new HBM4 therefore boasts over 60% higher bandwidth, and Micron also claims up to 20% better energy efficiency. In addition, Micron's HBM4 includes a built-in memory test feature to simplify integration for partners.
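For readers who want to check the arithmetic, here is a minimal sketch of how the per-stack figures above relate. The helper functions and variable names are illustrative, not an official Micron tool; only the interface widths, transfer rates, die capacity, and stack height come from the specifications quoted in this article.

```python
# Back-of-the-envelope check of the per-stack figures quoted above.

def peak_bandwidth_gbps(interface_bits: int, transfer_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (interface width in bytes) x (transfers per second)."""
    return (interface_bits / 8) * transfer_rate_gtps

def stack_capacity_gb(die_capacity_gbit: int, dies_per_stack: int) -> float:
    """Stack capacity in GB from per-die capacity (in gigabits) and stack height."""
    return die_capacity_gbit * dies_per_stack / 8

hbm3e = peak_bandwidth_gbps(1024, 9.2)    # ~1177.6 GB/s, i.e. ~1.2 TB/s
hbm4 = peak_bandwidth_gbps(2048, 7.85)    # ~2009.6 GB/s, i.e. ~2 TB/s

print(f"HBM3E peak: {hbm3e:.1f} GB/s")
print(f"HBM4 peak:  {hbm4:.1f} GB/s")
print(f"Uplift:     {100 * (hbm4 / hbm3e - 1):.0f}%")  # ~71%, comfortably over 60%
print(f"12-Hi stack of 24 Gb dies: {stack_capacity_gb(24, 12):.0f} GB")  # 36 GB
```

In other words, HBM4 reaches its roughly 2 TB/s per stack despite a lower per-pin transfer rate than HBM3E because the interface width doubles from 1024 to 2048 bits.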

Micron is among the first DRAM makers to officially sample HBM4 memory with partners; rivals Samsung and SK hynix are preparing their own HBM4 devices as well. Micron and other memory makers intend to start volume production of HBM4 sometime in 2026, when leading developers of AI processors ramp their next-generation designs.

Nvidia's data center GPUs, codenamed Vera Rubin, are expected to be among the first products to adopt HBM4 in late 2026, though they will certainly not be the only AI and HPC processors to use the new memory.

"Micron HBM4's performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."


Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.