Component News

Micron Launches Industry’s Highest-Capacity 192GB SOCAMM2 for Low-Power AI Data Centers

Next-generation LPDDR5X-based module boosts power efficiency and data throughput for sustainable AI growth

Micron Technology, Inc. has unveiled the 192GB SOCAMM2, the world’s highest-capacity low-power memory module, designed to redefine efficiency and performance in AI data centers. Built on Micron’s advanced 1-gamma DRAM process technology, the new module delivers a more than 20% improvement in power efficiency while extending Micron’s leadership in energy-conscious memory innovation.

The SOCAMM2 module, based on LPDDR5X technology, delivers 50% more capacity in the same compact footprint as its predecessor. It can reduce time to first token (TTFT) by over 80% in real-time AI inference workloads, making it a key enabler for large-scale, sustainable AI infrastructure.
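For readers unfamiliar with the metric, time to first token is simply the latency between issuing an inference request and receiving the first generated token. The sketch below is a minimal, hypothetical illustration of how TTFT is typically measured against a streaming model endpoint; the measure_ttft helper and fake_stream generator are illustrative names only, not part of any Micron or NVIDIA software.

# Minimal sketch (hypothetical names, not a vendor API): measuring time to
# first token (TTFT) for a streaming inference call.
import time

def measure_ttft(token_stream):
    """Return seconds from request start until the first token arrives."""
    start = time.perf_counter()
    for _ in token_stream:              # the first yielded item is the first token
        return time.perf_counter() - start
    return float("inf")                 # the stream produced no tokens

def fake_stream():
    """Stand-in generator that simulates prefill latency before tokens appear."""
    time.sleep(0.25)                    # pretend memory-bound prefill stage
    yield "Hello"
    yield "world"

print(f"TTFT: {measure_ttft(fake_stream()) * 1000:.0f} ms")

In practice the same timing loop would wrap a real streaming client, and the reported gains would come from the larger, faster memory feeding the accelerator rather than from the measurement code itself.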

“Micron continues to push the boundaries of memory innovation for sustainable AI infrastructure.”
— Raj Narasimhan, SVP & GM, Cloud Memory Business Unit, Micron Technology

“As AI workloads become more complex and demanding, data center servers must achieve increased efficiency, delivering more tokens for every watt of power,” said Raj Narasimhan, Senior Vice President and General Manager of Micron’s Cloud Memory Business Unit. “Micron’s proven leadership in low-power DRAM ensures that our SOCAMM2 modules provide the data throughput, energy efficiency, capacity, and data center-class quality essential to powering the next generation of AI servers.”

The new design enhances serviceability and scalability, optimizing performance for liquid-cooled and modular data center environments. SOCAMM2 is also two-thirds more power-efficient than comparable RDIMM modules and one-third their size, significantly reducing the physical and environmental footprint of high-performance data clusters.

Micron has been instrumental in shaping the JEDEC SOCAMM2 standard, collaborating closely with industry leaders, including NVIDIA, to advance low-power server memory adoption across AI ecosystems.

Customer samples of the 192GB SOCAMM2 are now shipping, with high-volume production aligned with major customer launch schedules, underscoring Micron’s commitment to accelerating sustainable innovation in AI-driven data centers.
