
American chip giant Intel has teamed up with Japanese tech and investment powerhouse SoftBank with a grand objective: to build a potent new alternative to High-Bandwidth Memory (HBM), the memory technology that feeds most AI processors today. The initiative takes shape under a new joint venture named Saimemory, and it is already attracting attention.
According to reports from Nikkei Asia, Saimemory will draw heavily on Intel's technology and on patents from Japanese academic institutions, including the prestigious University of Tokyo, toward an ambitious plan: to have a working prototype and a solid path to mass production ready by 2027. The ultimate goal is to put this new type of memory on the market before the decade closes.
You may ask: what is the fuss about HBM? It is prized because it can move enormous amounts of data, which is precisely what AI GPUs consume, and its speed makes it excellent for temporary storage. But HBM has its disadvantages:
- It is complicated to manufacture, which makes it expensive.
- It can get quite hot during operation.
- It can be quite power-hungry.
If the venture succeeds, SoftBank is reportedly expected to secure priority access to the end product, which sounds reasonable given the current AI chip frenzy. At the moment, only three manufacturers – Samsung, SK hynix, and Micron – make the latest HBM chips, and given the demand, procuring enough HBM can be quite a challenge.
Saimemory aims to step in with a solid alternative, especially for Japanese data centers. The project also matters to Japan in a bigger way: it is an attempt to re-establish the country as a major player in the memory chip market, a position it has not held for over two decades. Back in the 1980s, Japanese companies were the kings of memory, accounting for about 70 percent of the world's supply. Competition from South Korea and Taiwan changed that, but this initiative signals that the ambition is back.
Saimemory is not the only player looking to bring 3D stacked DRAM to market. Samsung has already spoken of plans along those lines, while NEO Semiconductor has been doing its own research with its 3D X-DRAM. Those efforts, however, seem more keen on maximizing capacity - think memory modules packing a whopping 512GB.
Saimemory, by contrast, is laser-focused on cutting power consumption. That is a critical need for data centers, which are seeing energy demand climb as AI workloads intensify. A more power-efficient memory solution could be a game-changer.
Keep an eye on this alliance between Intel and SoftBank. If Saimemory takes off, it has the potential to reshape AI hardware architecture, not to mention mark a grand return for Japan to the global memory chip stage.
