[CES 2026] SK Hynix Unveils HBM4 16-Layer 48GB for the First Time...Showcases Next-Generation AI Solutions

Operating a Customer Exhibition Booth, Showcasing HBM and More
Successor to the HBM4 12-Layer 36GB, Achieving Industry-Leading Speed
HBM3E 12-Layer 36GB Also on Display
SOCAMM2, LPDDR6, and 321-Layer QLC Featured

SK Hynix announced on January 6 that it will unveil its next-generation artificial intelligence (AI) solutions at CES 2026, the world's largest information technology (IT) and consumer electronics exhibition, held in Las Vegas, USA, from January 6 to 9 (local time). The lineup includes the first public showing of its sixth-generation high-bandwidth memory (HBM4) 16-layer 48GB product.


SK Hynix CES 2026 exhibition products. (Clockwise from left) ▲HBM4 16Hi 48GB ▲SOCAMM2 ▲LPDDR6. [Image: SK Hynix]

Until now, SK Hynix has operated both a joint exhibition booth with SK Group and a separate customer exhibition booth at CES. This year, however, the company will run only the customer exhibition booth, expanding its scale. The booth will be set up at the Venetian Expo, one of the official venues of CES 2026, and under the theme "Creating a Sustainable Future with Innovative AI Technology," SK Hynix plans to showcase a variety of technologies and products.


Drawing particular attention is the company's plan to exhibit, for the first time, its 16-layer 48GB HBM4 product, around which competition in the memory industry has intensified. The product achieves an industry-leading speed of 11.7Gbps and succeeds the previously released HBM4 12-layer 36GB model. SK Hynix stated, "Development is progressing smoothly in line with our customers' schedules."


In addition, the company will display its fifth-generation HBM (HBM3E) 12-layer 36GB product, expected to be a core offering in this year's HBM market. A global customer's latest AI server GPU module equipped with the product will also be on display, concretely demonstrating its role within AI systems.


Beyond HBM, SK Hynix will showcase SOCAMM2, a low-power memory module specialized for AI servers, highlighting the competitiveness of its diverse product portfolio amid surging AI server demand. The company will also present a lineup of general-purpose memory products optimized for AI, demonstrating its technological leadership across the market. Among them, the low-power DRAM "LPDDR6" stands out, significantly improving data processing speed and power efficiency over previous generations to optimize on-device AI performance.


In the NAND sector, SK Hynix will introduce its 321-layer 2-terabit QLC product, optimized for ultra-high-capacity eSSD, which is experiencing soaring demand due to the expansion of AI data center construction. This product is recognized for achieving the highest level of integration in the current market. As a result, it offers greatly enhanced power efficiency and performance compared to previous-generation QLC products, making it highly advantageous in power-sensitive AI data center environments.


Rendering of the SK Hynix exhibition booth at CES 2026. [Image: SK Hynix]

Additionally, SK Hynix will operate an "AI System Demo Zone," where visitors can see how its memory solutions for AI systems connect organically to form an ecosystem. On display and in live demonstrations will be ▲"Custom HBM (cHBM)," a customer-tailored product optimized for specific AI chip or system requirements ▲"AiMX," a low-cost, high-efficiency accelerator card for generative AI based on PIM (processing-in-memory) semiconductors ▲"CuD," which performs computation directly in memory ▲"CMM-Ax," which integrates computational functions into CXL memory ▲"CSD," a data-centric product capable of recognizing, analyzing, and processing data on its own.


In particular, reflecting the industry's keen interest in cHBM, a large-scale exhibit will allow visitors to examine its internal structure up close. As competition in the AI market shifts from raw performance to inference efficiency and cost optimization, SK Hynix has visualized a new design approach that moves certain computation and control functions, previously handled by GPU- or ASIC-based AI chips, directly into HBM.


Kim Joosun, President of AI Infrastructure and Chief Marketing Officer (CMO) at SK Hynix, said, "As AI-driven innovation accelerates, our customers' technical demands are evolving rapidly as well. We will meet those needs with differentiated memory solutions and, through close collaboration with our customers, create new value that advances the AI ecosystem."


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
