Nvidia CEO says Samsung HBM3e not yet ready for AI accelerator certification — Jensen Huang suggests more engineering work is required

Jensen Huang at GTC 2024 (Image credit: Nvidia)

Nvidia CEO Jensen Huang says Samsung’s advanced High Bandwidth Memory chips still aren’t ready for official certification. Nvidia’s sign-off is the final step before Samsung can begin supplying HBM3 and HBM3e chips, essential components for training artificial intelligence (AI) models on Nvidia’s platforms.

SK hynix is currently the primary supplier of HBM3 and HBM3e memory to Nvidia. These chips are critical for the fast and efficient training of AI models, including ChatGPT and others. Nvidia is evaluating HBM chips produced by Samsung and Micron but hasn’t yet approved them for use. More engineering work is needed, Huang told reporters. However, it isn’t entirely clear which engineers have the most work ahead of them: those at Samsung, those at Nvidia, or a joint team drawn from both.

Asked directly about reports alleging that Samsung’s HBM chips suffer from overheating and excessive power consumption, Huang dismissed them. “There is no story there,” he said.

South Korea-based SK hynix leads the pack in delivering HBM3 and HBM3e chips. The company’s production capacity for the chips is fully booked through next year, and SK hynix plans to spend $14.6 billion on a new production complex to meet demand.

Samsung’s investors have grown concerned that the electronics maker has yet to catch up with its smaller rival SK hynix. This concern may be one of the key factors behind Samsung’s recent decision to replace the head of its semiconductor division.

Jeff Butts
Contributing Writer

Jeff Butts has been covering tech news for more than a decade, and his IT experience predates the internet. Yes, he remembers when 9600 baud was “fast.” He especially enjoys covering DIY and Maker topics, along with anything on the bleeding edge of technology.