3D-Stacked Memory Technology Challenges High-Bandwidth Memory Dominance in AI Inference: d-Matrix Promises 10-fold Speed and Efficiency Boost with 3DIMC Technology

Santa Clara-based tech firm d-Matrix aims to supplant HBM in AI inference with 3DIMC, or 3D digital in-memory compute: a memory architecture the company says can be 10 times faster and more efficient than HBM in AI inference tasks.


In the rapidly evolving world of artificial intelligence (AI), Santa Clara-based startup d-Matrix is making waves with its 3DIMC technology. This memory type, purpose-built for AI inference, is poised to challenge the dominance of High-Bandwidth Memory (HBM) in the AI and high-performance computing sectors.

According to d-Matrix's CEO, Sid Sheth, AI inference now accounts for 50% of some hyperscalers' AI workloads. Sheth believes that the increasing costs, power consumption, and bandwidth limitations of traditional HBM memory systems necessitate a rethinking of memory itself for the future of AI inference.

D-Matrix's 3DIMC technology is the company's solution to this challenge. The Pavehawk 3DIMC silicon, now operational in the lab, pairs LPDDR5 memory dies with DIMC chiplets stacked on top, attached via an interposer. The DIMC logic dies in Pavehawk are optimized for matrix-vector multiplication, the core calculation performed by transformer-based AI models during inference.
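To make the workload concrete, here is a minimal sketch of the matrix-vector multiplication that DIMC hardware accelerates. The sizes and values are illustrative assumptions only; real transformer layers have weight matrices with dimensions in the thousands, which is exactly why moving those weights over a memory bus dominates inference cost.

```python
def matvec(W, x):
    """Multiply-accumulate: y[i] = sum over j of W[i][j] * x[j].
    In-memory compute runs this where the weights are stored,
    rather than shipping W across a bus (e.g. HBM) to a processor."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

# Toy weight matrix and activation vector (illustrative values).
W = [[1.0, 0.0, 2.0, 1.0],
     [0.0, 1.0, 1.0, 0.0],
     [1.0, 1.0, 0.0, 2.0],
     [2.0, 0.0, 1.0, 1.0]]
x = [1.0, 2.0, 0.5, 1.0]

y = matvec(W, x)
print(y)  # → [3.0, 2.5, 5.0, 3.5]
```

In a transformer, this operation is repeated for every layer and every generated token, so the weights are re-read from memory constantly; keeping the multiply-accumulate next to the stored weights is the efficiency argument behind in-memory compute.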

d-Matrix claims that 3DIMC can be up to 10 times faster than HBM in AI inference tasks while using up to 90 percent less power.

The HBM market is projected to grow at roughly 30% annually through 2030, with prices expected to rise to meet demand, making alternatives like d-Matrix's 3DIMC attractive to cost-conscious AI buyers.
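As a rough illustration of what that rate implies (assuming the 30% growth compounds from a 2025 baseline, a starting point chosen here purely for illustration):

```python
# 30% annual growth compounding from a relative 2025 baseline of 1.0.
# The base year and starting value are assumptions for illustration.
size = 1.0
for year in range(2025, 2030):
    size *= 1.30

print(round(size, 2))  # → 3.71, i.e. the market roughly 3.7x larger by 2030
```

Compounding at that rate nearly quadruples the market in five years, which is why sustained demand is expected to keep HBM prices elevated.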

D-Matrix is already looking ahead to its next generation, Raptor, which it claims will outpace HBM by 10x in inference tasks while using 90% less power. The DIMC effort continues d-Matrix's broader approach: building hardware purpose-designed to handle one specific computational task, in this case AI inference, as efficiently as possible.

As the race for AI dominance continues, advancements like d-Matrix's 3DIMC technology promise to reshape the landscape and push the boundaries of what's possible.
