[AI Revolution](25) 'HBM-PIM' Leading AI Future... Samsung Accelerates Memory Innovation

Introducing Innovative Memory Supporting AI Data Processing
HBM-PIM Enhances Performance and Reduces Power Consumption
Active External Collaboration for Next-Generation Technology Development

Next-generation memory semiconductors are the key to accelerating the era of artificial intelligence (AI). Samsung Electronics, the number one company in the memory industry, is striving for technological innovation by introducing memory products with computing functions in response to this trend. Instead of relying on existing technological paradigms, the company plans to open a new era of memory by addressing AI demand with fresh ideas.


Last month, Samsung Electronics participated in MemCon 2023 held in California, USA, where it unveiled several next-generation memory products. MemCon is a conference focused on exploring AI-related memory solutions and was held for the first time this year.


[Image: Example of HBM-PIM used with graphics processing units (GPUs) for large-scale AI support / Source: Samsung Electronics Semiconductor Newsroom]

At this event, Samsung Electronics showcased ▲ High Bandwidth Memory (HBM)-Processing In Memory (PIM), ▲ Compute Express Link (CXL) DRAM, ▲ Processing Near Memory (PNM), and ▲ next-generation Solid State Drives (SSD), all suitable for AI training and computing that require large-scale data processing.


Samsung Electronics also introduced these products at memory-related events held last October (Samsung Tech Day 2022) and August (Flash Memory Summit), demonstrating its commitment to developing next-generation products and securing the related market. As AI utilization grows, the use of high-performance and high-capacity memory becomes essential.


Currently, computers are designed based on the "Von Neumann architecture." The central processing unit (CPU) and memory are separate, with the CPU fetching instructions from memory, executing them, and then storing the results back in memory. As a result, the more data exchanged between the CPU and memory, the slower the processing becomes, causing what is commonly known as a bottleneck.
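The cost structure described above can be sketched numerically. In this toy model, total processing time splits into bus-transfer time and compute time, and the transfer term dominates; the bandwidth and throughput figures are illustrative assumptions, not specifications of any real system:

```python
# Toy model of the Von Neumann bottleneck: total time is compute time plus
# the time spent shuttling data between the CPU and memory. All figures
# below are illustrative assumptions, not real hardware specs.

def total_time(data_gb: float, bus_gb_per_s: float = 50.0,
               compute_gb_per_s: float = 500.0) -> float:
    """Seconds to process `data_gb`: bus transfer (both directions) + compute."""
    transfer = 2 * data_gb / bus_gb_per_s   # read inputs, write results back
    compute = data_gb / compute_gb_per_s
    return transfer + compute

if __name__ == "__main__":
    for gb in (1, 10, 100):
        t = total_time(gb)
        share = (2 * gb / 50.0) / t         # fraction of time spent on the bus
        print(f"{gb:>4} GB: {t:6.2f} s total, {share:.0%} spent moving data")
```

Under these assumed numbers, roughly 95% of the time goes to moving data rather than computing on it, and no amount of extra CPU speed changes that: only reducing data movement does.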

AI learns and computes based on vast amounts of data. As AI technology advances, data bottlenecks inevitably increase, making innovation in memory technology essential. To address this, Samsung Electronics has introduced several next-generation memory products in recent years that overturn traditional memory paradigms.


A representative achievement is the world's first HBM-PIM introduced in 2021. HBM-PIM is memory that reduces data movement by performing some of the computations traditionally handled by the CPU directly within the memory. Simply put, it is a new model of HBM equipped with an AI engine. HBM stacks multiple DRAM chips vertically to increase data transfer speed, and HBM-PIM offers higher performance and lower power consumption compared to standard HBM.
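The data-movement saving can be illustrated with a toy reduction: a conventional system ships every element across the bus to the CPU, while a PIM-style system reduces each memory bank's share in place and ships only the per-bank partial results. The bank count and element size here are illustrative assumptions, not HBM-PIM's actual organization:

```python
# Sketch of why in-memory compute cuts bus traffic for a reduction-style
# operation. Bank count and element size are illustrative assumptions.

def bytes_moved_conventional(n_elems: int, elem_bytes: int = 2) -> int:
    """Sum on the CPU: every input element crosses the CPU-memory bus."""
    return n_elems * elem_bytes

def bytes_moved_pim(n_banks: int = 16, elem_bytes: int = 2) -> int:
    """Sum in memory: each bank reduces its slice locally and sends one partial result."""
    return n_banks * elem_bytes

n = 1_000_000  # elements in the vector being reduced
conv = bytes_moved_conventional(n)
pim = bytes_moved_pim()
print(f"Conventional: {conv:,} bytes over the bus")
print(f"PIM-style:    {pim:,} bytes over the bus ({conv // pim:,}x less)")
```

The ratio grows with the data size, which is why the approach pays off most for the large-scale workloads AI training involves.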


In October last year, Samsung Electronics conducted performance tests of HBM-PIM with GPU manufacturer AMD. By equipping AMD's commercial GPU (MI-100) accelerator card with HBM-PIM, they confirmed that accelerator performance doubled while energy consumption was reduced by 50% compared to before the upgrade. Various foundational tasks are currently underway to commercialize HBM-PIM.


HBM itself is also advancing alongside this work. The technology has evolved through successive generations, from first-generation HBM to second-generation HBM2. Samsung Electronics is sampling 16-gigabyte (GB) and 24GB HBM3 parts to the market and is fully prepared for mass production. HBM3P, a next-generation product that improves on HBM3's performance and capacity, is also slated for release in the second half of the year.
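The generational speedup comes largely from faster per-pin signaling over the same 1024-bit stack interface. Using the baseline JEDEC per-pin data rates (public spec figures; shipping parts often run faster), per-stack bandwidth works out as:

```python
# Per-stack bandwidth arithmetic for HBM generations. The per-pin rates
# are the baseline JEDEC figures; actual products may exceed them.

PINS_PER_STACK = 1024  # HBM's interface width in bits, constant across generations

def stack_bandwidth_gb_s(gbps_per_pin: float) -> float:
    """Peak per-stack bandwidth: pins x per-pin rate (Gb/s), divided by 8 bits per byte."""
    return PINS_PER_STACK * gbps_per_pin / 8

for gen, rate in [("HBM", 1.0), ("HBM2", 2.0), ("HBM3", 6.4)]:
    print(f"{gen:4s}: {stack_bandwidth_gb_s(rate):6.1f} GB/s per stack")
```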

Samsung Electronics is also steadily introducing memory products built on CXL technology. CXL is a next-generation interface for high-performance computing systems: it adds connection pathways among the CPU, graphics processing units (GPUs), and memory, dramatically increasing the amount of data that can be processed.


Last year, Samsung Electronics became the first in the industry to introduce a 512GB high-capacity CXL DRAM. They also unveiled technology combining PNM, which places computing functions next to memory, with CXL. This year, they plan to offer CXL DRAM in various capacities and are developing CXL SSDs. In a conference call on the 27th, the company expressed optimism about expanding demand, stating, "Since the beginning of this year, inquiries for CXL customer samples have increased."
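The capacity side of CXL is simple to model: memory attached over the link joins the host's addressable pool alongside native DRAM, so capacity can grow past what the DIMM slots allow. The DIMM configuration below is a hypothetical example; only the 512GB expander capacity comes from the article:

```python
# Sketch of CXL-style memory expansion: the host's addressable pool is
# its native DRAM plus any CXL memory expanders attached over the link.
# The DIMM layout is a hypothetical assumption; 512 GB is the expander
# capacity cited in the article.

native_dimms_gb = [64] * 8   # hypothetical 8x 64 GB DDR DIMMs
cxl_expanders_gb = [512]     # one 512 GB CXL DRAM module

total_gb = sum(native_dimms_gb) + sum(cxl_expanders_gb)
print(f"Native DRAM:      {sum(native_dimms_gb)} GB")
print(f"CXL-attached:     {sum(cxl_expanders_gb)} GB")
print(f"Addressable pool: {total_gb} GB")
```

In this assumed configuration, a single expander triples the host's memory without touching the DIMM slots, which is what makes CXL attractive for memory-hungry AI systems.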


Samsung Electronics has recently increased external collaborations in the development of next-generation memory. In December last year, they announced plans to develop AI-dedicated memory products with Naver. Collaborating with Naver, which introduced the massive AI language model HyperCLOVA, they aim to deliver optimal semiconductor solutions to resolve memory bottlenecks. Specific collaborative tasks are currently being finalized.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
