SK hynix Begins Mass Production of 192GB SOCAMM2 'Setting a New Standard for AI Server Memory Performance'
- Mass production of 192GB high capacity products designed for the NVIDIA Vera Rubin platform
- Maximizes power efficiency by featuring high density DRAM based on the latest 1cnm process
- Company to closely collaborate with NVIDIA to solve bottlenecks in AI infrastructure and provide optimal performance
SEOUL, South Korea, April 19, 2026 /PRNewswire/ -- SK hynix Inc. (or "the company", www.skhynix.com) announced today that it has begun mass production of the 192GB SOCAMM2, a next-generation memory module standard based on the 1cnm process (sixth-generation of the 10-nanometer technology) LPDDR5X low-power DRAM.
SOCAMM2 [1] is a module that adapts low-power memory – which was previously used mainly in mobile products like smartphones – for server environments. It is designed to be a primary memory solution for next-generation AI servers.
[1]SOCAMM2 (Small Outline Compression Attached Memory Module 2): An AI server–optimized memory module based on LPDDR. It offers a slim form factor and high scalability, while its compression connector enhances signal integrity and allows for easy module replacement
SK hynix emphasized that the 1cnm-based SOCAMM2 product now in mass production delivers more than double the bandwidth and over 75% better power efficiency compared with conventional RDIMM [2], providing an optimized solution for high-performance AI operations.
[2]RDIMM (Registered Dual In-Line Memory Module): A DRAM module for servers and workstations that includes a register or buffer chip to relay address and command signals between the memory controller and the DRAM chips on the module
In particular, the company noted that its SOCAMM2 products are designed for the NVIDIA Vera Rubin platform.
SK hynix expects the new SOCAMM2 product to fundamentally resolve the memory bottlenecks encountered during the training and inference of large language models (LLMs) with hundreds of billions of parameters, thereby playing a pivotal role in dramatically accelerating the processing speed of the overall system.
The company stated that with the AI market shifting its focus from training to inference, SOCAMM2 is gaining significant attention as a next-generation memory solution capable of operating LLMs with low power consumption. To meet the demands of its global Cloud Service Provider (CSP) customers, SK hynix has not only prepared a broad supply portfolio but also stabilized its mass-production system early on.
"By supplying the 192GB SOCAMM2, SK hynix has established a new standard for AI memory performance," Justin Kim, President & Head of AI Infra (CMO, Chief Marketing Officer) at SK hynix said. "We will solidify our position as the most trusted AI memory solution provider, through close collaboration with our global AI customers."
About SK hynix Inc.
SK hynix Inc., headquartered in Korea, is the world's top-tier semiconductor supplier offering Dynamic Random Access Memory chips ("DRAM") and flash memory chips ("NAND flash") to a wide range of distinguished customers globally. The company's shares are traded on the Korea Exchange, and its Global Depositary Shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at www.skhynix.com and news.skhynix.com.
SOURCE SK hynix Inc.