Intel and SoftBank collaborate on power-efficient HBM substitute for AI data centers, says report

RAM (Image credit: Shutterstock)

American chip giant Intel has partnered with Japanese tech and investment powerhouse SoftBank to build a stacked DRAM substitute for HBM. According to Nikkei Asia, the two industry behemoths set up Saimemory to build a prototype based on Intel technology and patents from Japanese academia, including the University of Tokyo. The company is targeting a completed prototype and a mass-production viability assessment by 2027, with commercialization planned before the end of the decade.

Most AI processors use HBM, or high-bandwidth memory, which is well suited to temporarily storing the massive amounts of data that AI GPUs process. However, these chips are complex to manufacture and relatively expensive, and they also run hot and draw considerable power. The partnership aims to address this by stacking DRAM dies and wiring them more efficiently; the companies claim the resulting stacked DRAM chip would consume roughly half the power of a comparable HBM chip.
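Neither company has detailed how the wiring scheme would achieve that figure, but the claim is at least plausible under the standard CMOS dynamic power relation, in which switching power scales roughly with capacitance times voltage squared times frequency. The Python sketch below is purely illustrative; every number in it is a hypothetical assumption, not a disclosed Saimemory specification.

```python
# Back-of-envelope sketch (not based on any disclosed Saimemory figures).
# Dynamic power of a memory interface scales roughly as P ~ alpha * C * V^2 * f,
# so cutting effective wire capacitance and/or signal swing lowers power.

def dynamic_power(alpha: float, capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    """Approximate CMOS dynamic switching power in watts."""
    return alpha * capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical baseline HBM-like interface (illustrative numbers only).
baseline = dynamic_power(alpha=0.15, capacitance_f=2e-9, voltage_v=1.1, freq_hz=2e9)

# Same interface with ~40% less wire capacitance and a slightly lower swing,
# the kind of gains more efficient routing might target.
improved = dynamic_power(alpha=0.15, capacitance_f=1.2e-9, voltage_v=0.95, freq_hz=2e9)

print(f"baseline:  {baseline:.2f} W")
print(f"improved:  {improved:.2f} W")
print(f"reduction: {1 - improved / baseline:.0%}")  # lands in the ~50% ballpark
```

In this toy model, trimming effective wire capacitance by around 40% and lowering the signaling voltage slightly is enough to approach the claimed 50% reduction; the real design may rely on entirely different levers.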

If successful, SoftBank says it wants priority access to the supply of these chips. At the moment, only three companies produce the latest HBM chips: Samsung, SK hynix, and Micron. The insatiable demand for AI chips means that HBM supply can be hard to come by, so Saimemory aims to corner the market with its substitute, at least for Japanese data centers. This will also be the first time that Japan aims to become a major memory chip supplier in over 20 years. Japanese firms dominated the market in the 1980s, when they manufactured about 70% of the global supply, but the rise of South Korean and Taiwanese competitors has since pushed most of the country's memory chip manufacturers out of the market.

This won’t be the first time a semiconductor company has experimented with 3D stacked DRAM. Samsung announced plans for 3D stacked DRAM as early as last year, while another company, NEO Semiconductor, is working on 3D X-DRAM. However, those efforts focus on enlarging the capacity of each chip, with memory modules targeted at 512GB. Saimemory, by contrast, is aiming for reduced power consumption, something data centers sorely need as AI power consumption keeps rising year after year.


Jowi Morales
Contributing Writer

Jowi Morales is a tech enthusiast with years of experience in the industry. He has written for several tech publications since 2021, covering tech hardware and consumer electronics.

  • The Historical Fidelity
    Since HBM is itself a stacked DRAM solution, the only difference with Intel’s offering will be the efficiency of routing data and power lines.
    So my question is, will this “efficient wiring” be internal to the stack only? So that the package is interchangeable with HBM on physical boards? Does this new offering retain the same timings and variables as HBM to make it compatible with current HBM memory controllers?
    There are many factors not mentioned in this article that will have significant effects on whether Intel-Bank’s offering will survive in the market.
    Reply
  • Pierce2623
    So they want to give GDDR the same treatment as modern SSDs gave flash memory, where they increase bandwidth by just accessing many more memory banks concurrently?
    Reply
  • wwenze1
    You need somebody to manufacture it tho.

    On one hand it feels like business vapor talk

    On the other hand Intel has made memory in the past

    On the third hand Intel was also saying how good their new GPU idea will be and look how it turned out...

    Any cost-saving idea won't work if the end product still ends up more expensive or too low-margin
    Reply
  • usertests
    https://www.tomshardware.com/news/samsung-reveals-ddr5-7200-512gb-ram
    Here's a 2021 story about a 512 GB module using TSV stacked dies. It sounds like they want a lower power alternative to TSVs.

    If they are stacking dies, it's a stopgap before true 3D DRAM which would be analogous to V-NAND and have amazing density and higher power efficiency than TSV stacked dies (although it could be combined with that). Samsung is pursuing 32 layers in the early 2030s if I remember correctly.
    Reply
  • bit_user
    The article said:
    This will also be the first time that Japan aims to become a major memory chip supplier in over 20 years.
    You should say DRAM, when that's what you're talking about, since the term "memory" is sometimes also used to include NAND flash. Japan never left the NAND market.

    The article said:
    This won’t be the first time that a semiconductor company is experimenting with 3D stacked DRAM.
    Need to distinguish between stacking and 3D DRAM. The former involves die stacking and (apart from HBM) has been around since at least DDR4 (found in high-capacity server DIMMs) and LPDDR5 and might or might not come along with any performance benefits.

    3D DRAM refers to layering multiple planes of DRAM cells within a single die, as the NAND industry has done for flash memory, since about a decade ago.
    Reply
  • bit_user
    I had an idea about how this might work, BTW. What if Intel's idea is to use hybrid bonding to pair dies? If so, maybe they can combine DRAM control circuitry across a pair of dies, which would align neatly with their 2x efficiency figure. You'd still need TSVs as a communication channel through the entire stack, but the electrical load the memory controller would have to drive could potentially be halved.
    Reply