
American chip giant Intel has partnered with Japanese tech and investment powerhouse SoftBank to develop a stacked DRAM alternative to HBM. According to Nikkei Asia, the two industry behemoths have set up a company called Saimemory to build a prototype based on Intel technology and patents from Japanese academia, including the University of Tokyo. The company is targeting a completed prototype and a mass-production viability assessment by 2027, with the end goal of commercialization before the end of the decade.
Most AI processors use HBM (high-bandwidth memory) chips, which are well suited to temporarily storing the massive amounts of data that AI GPUs process. However, these ICs are complex to manufacture, relatively expensive, run hot, and draw considerable power. The partnership aims to address this by stacking DRAM chips and wiring them together more efficiently; the companies claim this approach could halve power consumption versus a comparable HBM chip.
If successful, SoftBank says it wants priority access to the supply of these chips. At the moment, only three companies produce the latest HBM chips: Samsung, SK hynix, and Micron. The insatiable demand for AI chips means that HBM supply can be hard to come by, so Saimemory aims to corner the market with its alternative, at least for Japanese data centers. This would also mark Japan's first bid to become a major memory chip supplier in over 20 years. Japanese firms dominated the market in the 1980s, when they manufactured about 70% of the global supply, but the rise of South Korean and Taiwanese competitors has since pushed most Japanese memory chip makers out of the market.
This won’t be the first time a semiconductor company has experimented with 3D stacked DRAM. Samsung announced plans for 3D stacked DRAM as early as last year, while another company, NEO Semiconductor, is working on 3D X-DRAM. However, those efforts focus on enlarging the capacity of each chip, with memory modules targeted to reach 512GB. Saimemory, on the other hand, is aiming for reduced power consumption, something data centers sorely need as AI power draw climbs year after year.