The escalating demand for AI accelerators in recent years has pushed memory suppliers to deliver high-bandwidth solutions quickly enough to keep pace with faster training and higher inference token throughput. In response, Intel has partnered with Japan's SoftBank and the University of Tokyo to launch a startup named "Saimemory". The venture is developing a stacked DRAM intended to supplant current High-Bandwidth Memory (HBM) technologies, addressing HBM's complex manufacturing processes, high costs, heat generation, and excessive power draw, with a targeted 50% reduction in power consumption. Saimemory aims to finalize a prototype and assess the viability of mass production by 2027, with commercialization targeted for 2030.
