HBM4 12-Layer Samples Provided to Nvidia and Others
Mass Production Targeted for Completion Within the Year
Highest Capacity Achieved for 12-Layer Configuration
Bandwidth Exceeds 2TB per Second
SK Hynix, a leader in high-bandwidth memory (HBM), is taking the lead once again with HBM4, the sixth generation of the technology. It has opened the door to the HBM4 era by becoming the first company in the world to deliver 12-layer HBM4 samples to customers, ahead of its competitors.
On the 19th, SK Hynix announced, "We have delivered samples of 12-layer HBM4, a new ultra-high-performance DRAM for artificial intelligence (AI), to major customers, including Nvidia, for the first time in the world." The company explained, "Building on the technological competitiveness and production experience with which we have led the HBM market, we are shipping 12-layer HBM4 samples earlier than initially planned and beginning the certification process with customers. We will also complete mass production preparations within the second half of the year to solidify our position in the next-generation AI memory market." SK Hynix plans to exhibit a model of the 12-layer HBM4 product at Nvidia's AI conference GTC 2025, held in San Jose, USA, through the 21st.
According to SK Hynix, the 12-layer HBM4 product sampled this time achieves the world-class speeds required for AI memory and offers the highest capacity in the world for a 12-layer configuration. It is the first HBM product to realize a bandwidth of more than 2TB (terabytes) per second, equivalent to processing the data of more than 400 Full-HD movies (5GB each) in a single second, and over 60% faster than the previous generation, HBM3E. In HBM products, bandwidth refers to the total amount of data that one HBM package can process per second.
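The article's bandwidth comparisons can be sanity-checked with simple arithmetic. The sketch below uses the decimal conversion (1TB = 1,000GB) and back-solves an implied HBM3E figure from the "over 60% faster" claim; both of those choices are assumptions, since the article states only the HBM4 numbers.

```python
# Sanity check of the article's figures (a sketch; the decimal TB->GB
# conversion and the back-solved HBM3E bandwidth are assumptions).
hbm4_bandwidth_tb = 2.0        # "more than 2TB per second"
movie_size_gb = 5.0            # one Full-HD movie, per the article

# Movies processed per second: 2 TB/s / 5 GB per movie.
movies_per_second = hbm4_bandwidth_tb * 1000 / movie_size_gb
print(movies_per_second)       # 400.0 -> "more than 400" once bandwidth exceeds 2TB/s

# "Over 60% faster than HBM3E" implies an HBM3E bandwidth of at most
# 2 / 1.6 = 1.25 TB/s, consistent with HBM3E's roughly 1.2TB/s class.
implied_hbm3e_tb = hbm4_bandwidth_tb / 1.6
print(round(implied_hbm3e_tb, 2))  # 1.25
```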
Additionally, the company applied the 'Advanced MR-MUF' process, whose competitiveness was proven in previous generations, to achieve the industry's highest 12-layer capacity of 36GB. The process controls chip warpage and enhances heat dissipation, maximizing product stability. Since entering the market with HBM3 in 2022, SK Hynix has continued to lead the AI memory market by developing and supplying HBM products in a timely manner, including becoming the first in the industry to mass-produce 8-layer and 12-layer HBM3E in 2024.
Kim Ju-seon, President of AI Infrastructure (CMO) at SK Hynix, said, "We have established ourselves as a leader in AI ecosystem innovation by continuously overcoming technological limits to meet customer demands. Building on the industry's largest HBM supply experience, we will proceed smoothly with performance verification and mass production preparations."
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.