SK hynix Launches Mass Production of AI Server Memory for Nvidia’s Vera Rubin Platform

Seoul: SK hynix Inc. announced it has initiated mass production of a state-of-the-art memory module tailored for artificial intelligence (AI) servers, aiming to bolster its presence in the AI infrastructure sector. The company revealed that the 192GB SOCAMM2 module is built on its sixth-generation 10-nanometer-class LPDDR5X low-power DRAM technology. This module is specifically crafted for Nvidia Corp.'s Vera Rubin AI platform, marking a significant advancement in AI server memory.

According to Yonhap News Agency, SK hynix said the SOCAMM2 module adapts mobile-oriented low-power memory for server environments, positioning it as a primary memory solution for next-generation AI servers. The product offers more than double the bandwidth and over 75 percent better power efficiency than conventional RDIMMs, making it well suited to high-performance AI workloads.

The South Korean chip giant stated that the new module is poised to alleviate memory bottlenecks in the training and inference of large language models (LLMs) with hundreds of billions of parameters, thereby enhancing overall system performance. Kim Joo-sun, president and head of AI infrastructure at SK hynix, emphasized, "By supplying the 192GB SOCAMM2, SK hynix has established a new standard for AI memory performance."
