Micron’s 192GB SOCAMM2, built on 1-gamma DRAM technology, improves power efficiency by over 20%, cuts time to first token (TTFT) in AI inference by more than 80%, and offers a modular design for serviceability and future capacity expansion in large-scale data centers.
Micron Technology has announced the customer sampling of its new 192GB SOCAMM2 (Small Outline Compression Attached Memory Module), designed to drive energy efficiency and performance in next-generation AI data centers. The launch marks a major advancement in Micron’s low-power memory portfolio, extending the capabilities of its first-to-market LPDRAM SOCAMM solution with 50% higher capacity in the same compact form factor.
Enhancing performance and energy efficiency
The 192GB SOCAMM2 leverages Micron’s cutting-edge 1-gamma DRAM process technology, delivering over 20% improvement in power efficiency and helping optimize power design across large data center clusters. The company said the added capacity could reduce time to first token (TTFT) in real-time AI inference workloads by more than 80%, a critical performance gain for large-scale AI models.
Micron’s modular SOCAMM2 design also enhances serviceability and enables easier future capacity expansion. In large AI installations, where racks can include over 40 terabytes of CPU-attached DRAM, these improvements can translate into substantial power and cost savings.
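To put the rack-scale figure in context, here is a minimal back-of-the-envelope sketch. It assumes binary terabytes and uses the module capacities stated in the announcement (192GB for SOCAMM2, with the first-generation SOCAMM at 128GB, since 192GB is described as 50% higher); the resulting module counts are illustrative estimates, not Micron figures.

```python
# Illustrative estimate: how many memory modules a 40TB CPU-attached
# DRAM rack implies, comparing 192GB SOCAMM2 with the prior 128GB SOCAMM.
# Assumes binary units (1 TB = 1024 GB); counts are not Micron's figures.

RACK_DRAM_TB = 40        # "over 40 terabytes" per the announcement
SOCAMM2_GB = 192         # SOCAMM2 module capacity
SOCAMM1_GB = 128         # first-gen SOCAMM (192GB is "50% higher")

rack_gb = RACK_DRAM_TB * 1024
socamm2_modules = rack_gb // SOCAMM2_GB   # modules needed at 192GB each
socamm1_modules = rack_gb // SOCAMM1_GB   # modules needed at 128GB each

print(f"{RACK_DRAM_TB}TB rack: ~{socamm2_modules} x 192GB modules "
      f"vs ~{socamm1_modules} x 128GB modules")
```

The same DRAM footprint therefore needs roughly a third fewer module slots, which is one way the higher per-module capacity can translate into power and serviceability savings at rack scale.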
Strengthening AI collaboration and industry standards
The new memory solution builds upon Micron’s five-year collaboration with NVIDIA, advancing the use of low-power LPDDR5X server memory for AI systems. SOCAMM2 combines high bandwidth and exceptional power efficiency, setting a new benchmark for AI training and inference workloads.
“As AI workloads become more complex, data center servers must deliver more tokens per watt,” said Raj Narasimhan, Senior Vice President and General Manager of Micron’s Cloud Memory Business Unit. “Our SOCAMM2 modules deliver the throughput, energy efficiency, and reliability required for the next generation of AI data center servers.”
Micron has also contributed to JEDEC’s SOCAMM2 specification, collaborating with industry partners to accelerate low-power adoption across AI data centers. The company confirmed that SOCAMM2 samples—available in up to 192GB capacity and 9.6 Gbps speeds—are now shipping, with mass production scheduled to align with customer launch timelines.