Nvidia Reportedly Interested in Using SK Hynix HBM3E Memory

(Image credit: SK Hynix)

Nvidia is reportedly interested in evaluating SK Hynix's HBM3E samples, according to a DigiTimes report citing industry sources. If the information is accurate, Nvidia's next-generation compute GPU for artificial intelligence (AI) and high-performance computing (HPC) applications could use HBM3E memory instead of HBM3.

According to the industry sources cited by Korea's Money Today and Seoul Economic Daily, Nvidia has requested HBM3E samples from SK Hynix with a view to evaluating their impact on GPU performance.

SK Hynix's upcoming HBM3E memory will increase the data transfer rate from HBM3's 6.40 GT/s to 8.0 GT/s, which elevates per-stack bandwidth from 819.2 GB/s to a whopping 1 TB/s. However, there are uncertainties surrounding HBM3E's compatibility with existing HBM3 controllers and interfaces, as SK Hynix has not yet disclosed this aspect of the new technology. In any case, Nvidia and other developers of AI and HPC compute GPUs will need to evaluate the new technology.
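Those bandwidth figures follow directly from the data rate and HBM's 1,024-bit per-stack interface (the interface width is a known HBM3 spec, not stated in the report). A quick sketch, with an illustrative helper function of our own:

```python
def hbm_bandwidth_gbps(data_rate_gt_s: float, interface_bits: int = 1024) -> float:
    """Per-stack bandwidth in GB/s: data rate (GT/s) x interface width (bits) / 8 bits per byte."""
    return data_rate_gt_s * interface_bits / 8

print(hbm_bandwidth_gbps(6.4))  # HBM3:  819.2 GB/s
print(hbm_bandwidth_gbps(8.0))  # HBM3E: 1024.0 GB/s, i.e. roughly 1 TB/s
```

The ~25% jump in bandwidth is thus entirely due to the faster data rate; the stack's interface width stays the same.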

SK Hynix reportedly intends to begin sampling its HBM3E memory in the second half of 2023, with plans to start large-scale production in late 2023 or 2024. The company plans to build HBM3E using its 1b-nanometer fabrication process, its fifth-generation 10nm-class DRAM node. This same process is currently used to produce DDR5-6400 DRAM and will also be used to produce LPDDR5T memory chips for high-performance, low-power applications.

It remains to be seen which of Nvidia's compute GPUs will use HBM3E memory, though it is likely that the company will use the new type of memory for its next generation of processors due in 2024. Meanwhile, we do not know whether this will be a revamped Hopper GH100 compute GPU or something brand new.

SK Hynix currently controls over 50% of the HBM memory market and is the only supplier of HBM3. It will also be the exclusive maker of HBM3E, at least initially.

Yole Développement, a market research firm, projects significant expansion of the HBM memory market, as HBM has a unique bandwidth advantage over other types of DRAM. The firm estimates that the market, valued at $705 million in 2023, will nearly double to $1.324 billion by 2027.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • Meanwhile, we do not know whether this will be a revamped Hopper GH100 compute GPU or something brand new.

    Although the maximum potential of HBM3 DRAM has yet to be realized, I think NVIDIA may skip it entirely in favor of HBM3E for the Hopper-Next GPUs, which are expected to be codenamed "Blackwell". Blackwell will use a completely new architecture.

    BTW, if HBM3 supplies become an issue, NVIDIA could also go with Samsung, which is preparing its own "Snowbolt" HBM3P memory that would offer up to 5 TB/s of bandwidth per stack.

    Samsung's roadmap also shows HBM3P with PIM (processing-in-memory) by 2025 and HBM4 by 2026.

    https://zdnet.co.kr/view/?no=20230503112355