Samsung Begins Mass Production of CXL 2.0, Eyes 3.0 Launch
CXL Emerges as Solution to AI Memory Limitations
Semiconductor Companies Accelerate CXL Development
Samsung Electronics is set to begin mass production of Compute Express Link (CXL) memory, often billed as the successor to high-bandwidth memory (HBM). As memory bottlenecks intensify in servers used for artificial intelligence (AI) training and inference, CXL memory is drawing attention as a next-generation solution that can both accelerate data processing and expand system capacity.
Samsung Accelerates CXL-Based DRAM
According to the semiconductor industry on December 29, Samsung Electronics has recently completed customer sample shipments of its CXL 2.0-based DRAM, the "CMM-D (CXL-based DRAM memory module)," and has started mass production. This past October, at the "OCP Global Summit 2025" held in San Jose, California, Samsung Electronics unveiled a next-generation AI memory roadmap spanning from HBM to CXL. At the event, Song Taeksang, Executive Director at Samsung Electronics, stated, "We plan to introduce the industry's first CMM-D product supporting the CXL 3.1 standard in the fourth quarter of this year," and introduced the key specifications of the next-generation "CMM-D 3.0."
The CMM-D is a DRAM memory module built on the CXL interface. After becoming the first in the industry to develop the CMM-D in May 2021, Samsung Electronics introduced the CMM-D 2.0, based on the CXL 2.0 standard, in May 2022. The CMM-D 2.0 offers capacities of 128GB and 256GB, with a maximum bandwidth of 36GB/s.
The newly introduced CMM-D 3.0, released after about three years, supports the enhanced connectivity of the CXL 3.1 standard. According to Samsung Electronics, it achieves up to 1TB of large capacity and up to 72GB/s of bandwidth, significantly improving capacity and data processing speed compared to previous generations. It is expected to be highly useful in data center environments operating large-scale AI models.
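From the spec figures cited above, the generational gains can be checked directly: bandwidth doubles and maximum capacity roughly quadruples (treating 1TB as 1,024GB):

```python
# Spec figures for Samsung's CMM-D modules, as cited in the article.
cmm_d_2 = {"capacity_gb": 256, "bandwidth_gbps": 36}    # CMM-D 2.0 (CXL 2.0)
cmm_d_3 = {"capacity_gb": 1024, "bandwidth_gbps": 72}   # CMM-D 3.0 (CXL 3.1); 1TB = 1,024GB

capacity_gain = cmm_d_3["capacity_gb"] / cmm_d_2["capacity_gb"]
bandwidth_gain = cmm_d_3["bandwidth_gbps"] / cmm_d_2["bandwidth_gbps"]

print(f"Capacity gain:  {capacity_gain:.0f}x")   # 4x
print(f"Bandwidth gain: {bandwidth_gain:.0f}x")  # 2x
```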
"Memory Highway"... Reducing the Burden of Server Expansion
CXL is an open standard interface that enables high-performance computing devices such as central processing units (CPUs), graphics processing units (GPUs), memory, and AI accelerators to exchange data quickly and efficiently. In traditional server architectures, once memory capacity reached its limit, the only option was to add more servers. With CXL, however, memory can be expanded flexibly, much like attaching an external device.
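The expansion economics described above can be sketched with a toy model. All numbers below are illustrative assumptions for the sketch, not figures from the article: without CXL, capacity grows only in whole-server increments, while with CXL a single host can attach additional memory expander modules.

```python
import math

# Illustrative assumptions (not figures from the article):
SERVER_DRAM_GB = 512    # native DRAM per server
CXL_MODULE_GB = 1024    # one CXL memory expander module (a 1TB CMM-D-class device)

def servers_needed(required_gb: int) -> int:
    """Traditional scaling: add whole servers until the capacity target is met."""
    return math.ceil(required_gb / SERVER_DRAM_GB)

def cxl_modules_needed(required_gb: int) -> int:
    """CXL scaling: keep one server, attach expander modules for the remainder."""
    extra_gb = max(0, required_gb - SERVER_DRAM_GB)
    return math.ceil(extra_gb / CXL_MODULE_GB)

# To hold a 4TB (4,096GB) working set in memory:
print(servers_needed(4096))      # 8 servers without CXL
print(cxl_modules_needed(4096))  # 1 server plus 4 CXL modules
```

The point of the sketch is the shape of the trade-off, not the exact numbers: CXL turns memory capacity into an independently scalable resource instead of one bundled with whole servers.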
Within the industry, CXL is often compared to a "memory highway." By allowing different computing devices to share data through a unified interface, system resources can be used more efficiently. This is especially beneficial in environments like AI servers, where large-scale data processing and frequent memory access are required, as it can improve performance and reduce costs.
CXL memory is not intended to replace HBM, but rather to play a complementary role. While HBM handles ultra-fast computation near the GPU, CXL memory is responsible for expanding overall system capacity and improving resource utilization efficiency. This is why CXL is emerging as the "next-generation AI memory."
Growing Expectations for Commercialization... Calls for a Measured Approach Remain
Industry experts believe that the CXL market is entering a full-fledged commercialization phase. According to market research firm Yole, the CXL market, which was only $1.7 million (about 2.35 billion KRW) in 2022, is projected to grow rapidly to $2.1 billion in 2026 and approximately $16 billion in 2028.
As demand for memory expansion increases, CXL standardization is also accelerating. The "CXL Consortium," launched in 2019 under Intel's leadership, includes global semiconductor companies such as Samsung Electronics, SK hynix, Micron, and AMD.
SK hynix is also accelerating the development of CXL DRAM memory. The company completed customer certification for its CXL 2.0-based CMM-D 96GB product in the first half of this year and is currently undergoing certification procedures for its 128GB product. Last month, at the "SK AI Summit 2025," SK hynix also presented a next-generation memory strategy encompassing HBM, CXL, and Processing-In-Memory (PIM).
The three major memory companies, Samsung Electronics, SK hynix, and Micron, have recently completed DRAM compatibility testing with "Structera," Marvell's CXL 2.0-based solution, expanding their cooperation with global companies.
However, cautious views remain on when the CXL market will take off, because deploying CXL in real service environments requires optimization across operating systems (OS), frameworks, memory, and accelerators. Some believe full-scale commercialization may not arrive until after 2027.
In particular, NVIDIA's focus on connecting GPUs via NVLink and building its own ecosystem is seen as a variable affecting the Intel-led expansion of CXL. Nam Ihyeon, Chief Technology Officer (CTO) of the South Korean fabless (semiconductor design) company FADU, stated at a recent meeting, "We are strategically adjusting our investment in CXL switch development," adding, "We will determine the timing of productization after observing the pace of market growth."
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
![[Chip Talk] Is CXL the Next After HBM?... The AI Memory Market Expands](https://cphoto.asiae.co.kr/listimglink/1/2025122521002269786_1766664022.jpg)
![[Chip Talk] Is CXL the Next After HBM?... The AI Memory Market Expands](https://cphoto.asiae.co.kr/listimglink/1/2025122415533569309_1766559215.png)
![[Chip Talk] Is CXL the Next After HBM?... The AI Memory Market Expands](https://cphoto.asiae.co.kr/listimglink/1/2025122415533669310_1766559217.png)