Galaxy S24 to Feature LLW DRAM
DRAM with Expanded Memory Bandwidth (Speed)
Reduced Latency Ideal for AI Inference Tasks
Samsung Electronics' flagship smartphone Galaxy S24, scheduled for release in the first quarter of next year, is expected to be equipped with a new type of memory called Low Latency Wide I/O (LLW I/O) DRAM.
This technology attracted industry attention a few months ago when Apple announced it would be introduced in its virtual reality (VR) device, Vision Pro. The Galaxy S24 will be the first smartphone to feature this memory, with the potential to maximize the device's AI inference performance.
A Highway to Increase Memory Speed
Every computer chip has input/output (I/O) interfaces: small pins attached to the chip through which its data is transmitted. I/O can be likened to a data highway. LLW DRAM is a memory device that significantly increases the number of I/O pins to boost DRAM's data transfer speed.
Apple's Vision Pro uses SK Hynix's LLW DRAM. This DRAM has 512 I/O pins, eight times as many as conventional DRAM (64 pins). Since bandwidth, the data transfer rate of the memory, scales with the number of pins, it expands accordingly.
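The pin-count comparison can be put in simple arithmetic. A minimal sketch, assuming a hypothetical per-pin data rate (the value below is a placeholder for illustration, not a published LLW DRAM specification):

```python
# Illustrative sketch: aggregate DRAM bandwidth scales with I/O pin count.
# PER_PIN_GBPS is a hypothetical per-pin rate, not a real LLW DRAM spec.

def bandwidth_gbps(io_pins: int, per_pin_gbps: float) -> float:
    """Aggregate bandwidth = number of I/O pins x per-pin data rate."""
    return io_pins * per_pin_gbps

PER_PIN_GBPS = 4.0  # hypothetical per-pin rate (Gbit/s)

conventional = bandwidth_gbps(64, PER_PIN_GBPS)   # 64-pin conventional DRAM
llw = bandwidth_gbps(512, PER_PIN_GBPS)           # 512-pin LLW DRAM

print(f"conventional: {conventional:.0f} Gbit/s")
print(f"LLW:          {llw:.0f} Gbit/s")
print(f"ratio:        {llw / conventional:.0f}x")
```

Whatever the actual per-pin rate, an 8x wider bus yields 8x the aggregate bandwidth at the same rate.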
Breaking Through the Limits of the Hierarchical Structure
What are the benefits of increasing DRAM bandwidth? To understand this, you first need to know about the so-called "memory hierarchy." The Korean semiconductor industry is known for its strength in memory, but "memory" itself spans a range of devices: storage (HDD, SSD), DRAM, cache (L1 and L2 SRAM), and registers.
Delving too deeply into the hierarchy gets complicated, so it is usually simplified as a pyramid. Going down, speed (i.e., bandwidth) decreases but capacity increases; going up, speed increases but capacity decreases. Cache, embedded inside the CPU, is therefore among the fastest levels, second only to registers.
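The pyramid can be sketched with rough numbers. The latencies and capacities below are order-of-magnitude illustrations only; real figures vary widely by product and generation:

```python
# Illustrative memory-hierarchy tiers, top (fastest, smallest) to
# bottom (slowest, largest). Numbers are rough orders of magnitude.
hierarchy = [
    # (level,                 typical latency (ns), typical capacity)
    ("register",              0.3,          "~bytes"),
    ("cache (L1/L2 SRAM)",    1.0,          "KB-MB"),
    ("DRAM",                  80.0,         "GB"),
    ("SSD",                   100_000.0,    "up to TB"),
    ("HDD",                   10_000_000.0, "TB"),
]

for level, latency_ns, capacity in hierarchy:
    print(f"{level:<20} ~{latency_ns:>12,.1f} ns   {capacity}")
```

Each step down trades roughly one to three orders of magnitude of speed for a comparable gain in capacity, which is exactly the trade-off the pyramid picture expresses.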
Now it is clear why DRAM is the mainstream memory product: it offers medium speed and medium capacity at a relatively affordable price, making it a convenient choice to place next to the CPU to enhance memory performance.
Virtual reality (VR) devices need to implement visuals that follow the movement of the human eye. Apple's Vision Pro minimizes latency issues by incorporating SK Hynix's LLW DRAM. [Image source=Apple]
The problem is DRAM's middling speed within the hierarchy. When the CPU retrieves data stored in DRAM for its calculations, latency inevitably occurs, degrading performance. This is a critical drawback for VR devices, where fast video processing is essential, and for AI tasks that must process massive amounts of parameter data.
In servers, latency issues are addressed with a special memory called HBM, which stacks multiple DRAM dies and connects them vertically with through-silicon vias (TSVs). However, HBM cannot be squeezed into small electronic devices like smartphones or VR headsets. The alternative that emerged is LLW DRAM: it will never be as fast as cache, but it is well suited to minimizing latency compared with existing mobile DRAM.
DRAM for AI
Next-generation smartphone chip architecture unveiled by Samsung. [Image source=Samsung Electronics 'X']
On the 30th of last month (local time), Samsung officially introduced on its social media channels a new SoC (System on Chip) architecture equipped with LLW DRAM. The image shows conventional DRAM and LLW DRAM placed right next to the processor: LLW DRAM works alongside the CPU, and when a task needs more memory, spare capacity in the conventional DRAM is borrowed.
This structure is ideal for AI inference tasks on smartphones. Generative AI like ChatGPT in particular demands high memory bandwidth; this is why NVIDIA's data-center GPUs use HBM. LLW DRAM has the potential to establish itself as the smartphone industry's HBM alternative.
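Why bandwidth dominates generative AI inference can be shown with a back-of-the-envelope estimate: during autoregressive decoding, roughly all model weights are read from memory for every generated token, so token throughput is capped by bandwidth divided by model size. All numbers below are hypothetical, for illustration only:

```python
# Back-of-the-envelope sketch: memory-bandwidth-bound token generation.
# Upper bound: tokens/s <= bandwidth (bytes/s) / model size (bytes),
# since the weights are streamed from memory once per generated token.
# The model size and bandwidth tiers below are hypothetical examples.

def max_tokens_per_sec(bandwidth_gb_s: float, params_billions: float,
                       bytes_per_param: int = 1) -> float:
    model_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Hypothetical 3B-parameter model quantized to 1 byte per parameter.
for bw in (50, 100, 200):  # illustrative bandwidth tiers in GB/s
    print(f"{bw:>3} GB/s -> up to {max_tokens_per_sec(bw, 3):.0f} tokens/s")
```

The estimate makes the point plainly: doubling memory bandwidth roughly doubles the ceiling on tokens generated per second, regardless of how fast the processor itself is.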
Earlier, on the 8th of last month, Samsung unveiled its own generative AI model called "Gauss" at the 'Samsung AI Forum 2023' held in Korea. Gauss consists of three models that generate language, code, and images, and these functions will be processed both on Samsung’s cloud and on-device within smartphones.
If the new DRAM Samsung revealed works well, the Galaxy S24, launching next year, could become the vanguard of Samsung AI.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.