[Chip Talk] Is CXL the Next HBM? ... The AI Memory Market Expands

Samsung Begins Mass Production of CXL 2.0, Eyes 3.0 Launch
CXL Emerges as Solution to AI Memory Limitations
Semiconductor Companies Accelerate CXL Development

Samsung Electronics is set to begin mass production of Compute Express Link (CXL) memory, often referred to as the "next HBM (High Bandwidth Memory)." As memory bottlenecks intensify in artificial intelligence (AI) training and inference servers, CXL memory, which can boost data processing speeds while expanding system capacity, is emerging as a next-generation solution.

Samsung Accelerates CXL-Based DRAM
Samsung Electronics CMM-D (CXL Memory Module DRAM). Screenshot from the Samsung Electronics website.

According to the semiconductor industry on December 29, Samsung Electronics recently completed customer sample shipments of its CXL 2.0-based DRAM, the "CMM-D (CXL Memory Module DRAM)," and has now started mass production. This past October, at the "OCP Global Summit 2025" held in San Jose, California, Samsung Electronics unveiled its next-generation AI memory roadmap, spanning from HBM to CXL. At the time, Song Taeksang, Executive Director at Samsung Electronics, stated, "In the fourth quarter of this year, we plan to launch the industry's first CMM-D product supporting the CXL 3.1 standard," and introduced the key specifications of the next-generation "CMM-D 3.0."


The CMM-D is a DRAM memory module product based on the CXL interface. Samsung Electronics was the first in the industry to develop a CMM-D, in May 2021, and in May 2022 it introduced the CMM-D 2.0, which applies the CXL 2.0 standard. The CMM-D 2.0 offers 128GB and 256GB capacities with a maximum bandwidth of 36GB/s.


Arriving about three years after its predecessor, the CMM-D 3.0 supports the newer CXL 3.1 standard. According to Samsung Electronics, it delivers up to 1TB (terabyte) of capacity and a maximum bandwidth of 72GB/s, a significant improvement in both capacity and data processing speed over previous generations. It is expected to be particularly useful in data center environments operating large-scale AI models.
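The generational jump can be checked with quick arithmetic using the figures cited above (a sketch for illustration; 1TB is taken as 1,024GB):

```python
# Comparison of CMM-D generations using the capacities and bandwidths
# cited in the article (capacity in GB, bandwidth in GB/s).
cmm_d_2 = {"capacity_gb": 256, "bandwidth_gbps": 36}   # CXL 2.0 module, top capacity
cmm_d_3 = {"capacity_gb": 1024, "bandwidth_gbps": 72}  # CXL 3.1 module (1TB)

capacity_gain = cmm_d_3["capacity_gb"] / cmm_d_2["capacity_gb"]
bandwidth_gain = cmm_d_3["bandwidth_gbps"] / cmm_d_2["bandwidth_gbps"]

print(f"Capacity: {capacity_gain:.0f}x, bandwidth: {bandwidth_gain:.0f}x")
# → Capacity: 4x, bandwidth: 2x
```

In other words, the new generation quadruples the top capacity while doubling the bandwidth over the CMM-D 2.0.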


"Memory Highway"... Reducing Server Expansion Burden

CXL is an open standard interface that enables high-performance computing devices such as CPUs (central processing units), GPUs (graphics processing units), memory, and AI accelerators to exchange data quickly and efficiently. In traditional server architectures, when memory capacity reaches its limit, additional servers must be deployed. However, with CXL, memory can be flexibly expanded as an external device.


The industry often likens CXL to a "memory highway." By allowing different computing devices to share data through a unified interface, system resources can be utilized more efficiently. This is especially beneficial in environments like AI servers, which require large-scale data processing and frequent memory access, as it can enhance performance and reduce costs.
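The cost argument above can be sketched with a toy calculation. The numbers here (a 12TB workload, 2TB of local DRAM per server, 4TB of attachable CXL memory) are hypothetical assumptions, not figures from the article:

```python
import math

def servers_needed(total_tb, local_tb_per_server, cxl_tb_per_server=0):
    """Servers required when each holds local DRAM plus optional CXL-attached memory."""
    per_server = local_tb_per_server + cxl_tb_per_server
    return math.ceil(total_tb / per_server)

TARGET_TB = 12  # hypothetical memory footprint of an AI workload

# Traditional architecture: once DRAM slots are maxed out, add whole servers.
without_cxl = servers_needed(TARGET_TB, local_tb_per_server=2)

# With CXL: each server flexibly attaches external memory instead.
with_cxl = servers_needed(TARGET_TB, local_tb_per_server=2, cxl_tb_per_server=4)

print(without_cxl, with_cxl)  # → 6 2
```

Under these assumed numbers, expanding memory per server rather than adding servers cuts the server count from six to two, which is the efficiency argument the "memory highway" analogy is making.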


CXL memory does not replace HBM but serves a complementary role. While HBM handles ultra-fast computation near the GPU, CXL memory is responsible for expanding system-wide capacity and improving resource utilization efficiency. This is why CXL is emerging as the "next-generation AI memory."


Commercialization Expectations Grow... But Calls for Caution Remain

Industry experts believe the CXL market is entering a full-fledged commercialization phase. According to market research firm Yole, the CXL market, which was only $1.7 million in 2022, is projected to surge to $2.1 billion in 2026 and approximately $16 billion in 2028.


As demand for memory expansion grows, the standardization of CXL is also accelerating. The "CXL Consortium," launched in 2019 under Intel's leadership, includes global semiconductor companies such as Samsung Electronics, SK hynix, Micron, and AMD.


SK hynix is also accelerating the development of CXL DRAM memory. The company completed customer certification for its CXL 2.0-based CMM-D 96GB product in the first half of this year and is reportedly in the process of certifying its 128GB product. Last month, at the "SK AI Summit 2025," SK hynix also presented its next-generation memory strategy encompassing HBM, CXL, and Processing-In-Memory (PIM).


Recently, the three major memory companies (Samsung Electronics, SK hynix, and Micron) completed DRAM compatibility testing with "Structera," Marvell's CXL solution based on CXL 2.0 technology, expanding their cooperation with global enterprises.


However, there is still caution regarding the timing of the CXL market's full-scale emergence. This is because applying CXL technology to real-world service environments requires optimization across the entire stack, including operating systems (OS), frameworks, memory, and accelerators. Some experts predict that full-scale commercialization may not occur until after 2027.


In particular, NVIDIA's focus on connecting GPUs through its proprietary NVLink interconnect, effectively building its own ecosystem, is seen as a variable that could affect the Intel-led expansion of CXL. Nam Ihyun, Chief Technology Officer (CTO) of South Korean fabless company FADU, recently stated at a briefing, "We are strategically adjusting our investment in CXL switch development," adding, "We will determine the timing of commercialization after observing the pace of market growth."


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
