[Chip Talk] Memory Faces 'DeepSeek Shock'... Will Demand for Low-Performance HBM Increase?

A key component supporting large-scale computation, evolving alongside AI chip development
Rising demand for low-performance chips, falling demand for high-performance HBM
SK Hynix and Samsung Electronics face an inevitable need to revise market strategies
Overall HBM price range likely to drop below current levels
HBM's high added value for Korean companies; falling prices could mean weaker earnings

On the 1st, roughly ten days after Chinese artificial intelligence (AI) startup DeepSeek unveiled 'DeepSeek R1,' a reasoning AI model developed at an unexpectedly low cost, voices predicting a seismic shift in the global AI market are growing louder. Skepticism is spreading over whether expensive AI chips are really necessary, raising the possibility that global tech companies, which have so far favored high-performance AI chips despite their price, may rapidly change their AI model development strategies.



Along with this, the industry is cautiously anticipating changes in high-bandwidth memory (HBM), which is essential for operating AI chips. HBM is a key component mounted next to the graphics processing unit (GPU) in an AI accelerator, supporting large-scale computation. It works by widening the data transfer path so that large amounts of information can be moved at once. HBM has evolved in tandem with AI chip development and is now approaching its sixth generation, HBM4.
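To illustrate how a wider data path translates into throughput, the rough sketch below compares a single HBM3 stack with a typical GDDR6 chip. The figures are commonly cited JEDEC maximums used only for illustration; actual products vary by vendor and configuration.

```python
# Rough illustration of why a wide interface yields high bandwidth.
# Figures are commonly cited JEDEC maximums, used only as an example;
# real products differ by vendor, speed bin, and configuration.

hbm3_width_bits = 1024        # one HBM3 stack exposes a 1024-bit interface
hbm3_rate_gbps = 6.4          # per-pin data rate, gigabits per second

hbm3_gb_per_s = hbm3_width_bits * hbm3_rate_gbps / 8
print(f"HBM3 per-stack bandwidth: ~{hbm3_gb_per_s:.0f} GB/s")   # ~819 GB/s

gddr6_width_bits = 32         # a typical GDDR6 chip is 32 bits wide
gddr6_rate_gbps = 16.0        # but runs at a much higher per-pin rate

gddr6_gb_per_s = gddr6_width_bits * gddr6_rate_gbps / 8
print(f"GDDR6 per-chip bandwidth: ~{gddr6_gb_per_s:.0f} GB/s")  # ~64 GB/s
```

Even though GDDR6 runs each pin faster, the sheer width of the HBM interface gives it roughly an order of magnitude more bandwidth per device, which is why it sits next to the GPU in AI accelerators.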


However, if the DeepSeek shock drives demand toward lower-performance, lower-cost AI chips, customers are expected to stop insisting on expensive, high-performance HBM products. Given that HBM is among the most expensive memory products, many companies may also look to HBM as a place to cut costs. That makes it inevitable for Korean companies such as SK Hynix and Samsung Electronics, which have held a dominant position in the HBM market, to revise their strategies.


DeepSeek Used GPUs Equipped with HBM3

The reason the world is astonished by DeepSeek's AI model is that it achieved performance comparable to ChatGPT, developed by US-based OpenAI, at a fraction of the cost. Last year DeepSeek developed its large language model (LLM) 'V3' for $5,576,000 (approximately 7.88 billion KRW), and it reportedly did not spend much more to create R1. According to foreign media and DeepSeek's own account, the company trained the model on rented NVIDIA 'H800' chips, whose performance is capped below the commercial product because of US export controls on China, at $2 per GPU hour over roughly two months (a back-of-the-envelope check of this figure follows below).

DeepSeek demonstrated firsthand that a high-performance AI model can be built even with the H800. As a result, companies in some countries, including China, are expected to attempt more AI model training on the H800, and the industry predicts that HBM demand could follow this trend: if the H800 becomes popular, demand may concentrate on HBM3 rather than the latest generations. Having confirmed this possibility, China may increase government-level investment in, and imports of, AI chip materials, parts, and equipment such as HBM. Some in the Korean industry are watching this closely, arguing that it is time to open the door wider to the Chinese market and for Korean companies, which lead in HBM technology, to seek new opportunities. But there are also many opposing views. One industry insider said, "We need to watch the Chinese semiconductor market's reaction to DeepSeek more closely. Considering that Chinese companies have already reached a level where they can develop HBM3 on their own, it is questionable whether they would still need to buy Korean HBM."
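For reference, the arithmetic behind the headline cost figure is straightforward. The GPU-hour and cluster-size numbers below come from DeepSeek's own V3 technical report rather than from this article, and they cover only the final training run:

```python
# Back-of-the-envelope check of the reported V3 training cost.
# GPU-hour and cluster-size figures are taken from DeepSeek's V3
# technical report (not from this article); the $2/hour rental
# rate is the assumption DeepSeek itself used.

gpu_hours = 2_788_000      # reported H800 GPU hours for the final run
rental_rate_usd = 2.0      # assumed rental price per GPU hour
num_gpus = 2_048           # reported H800 cluster size

total_cost_usd = gpu_hours * rental_rate_usd
wall_clock_days = gpu_hours / num_gpus / 24

print(f"Estimated training cost: ${total_cost_usd:,.0f}")              # $5,576,000
print(f"Wall-clock time on {num_gpus} GPUs: ~{wall_clock_days:.0f} days")  # ~57 days
```

The product of the reported GPU hours and the assumed rental rate reproduces the $5,576,000 figure, and spreading those hours across the reported cluster lands at just under two months of training time.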



Will HBM’s ‘High Added Value’ Disappear?

If demand for low-cost AI chips expands and the market's HBM specification requirements come down, product prices are likely to fall accordingly. Once HBM products become outdated, they sell at lower prices. SK Hynix, Samsung Electronics, and others have been racing to release new products such as HBM4 this year, while older products have mainly been sold to countries where adopting the latest versions is difficult. If DeepSeek's development approach spreads and new demand emerges for older HBM products, however, the situation could change quickly: the overall HBM price range could be reset lower than it is now. Prices of older products with fresh demand may rise while prices of newer products with weaker demand fall, pulling the average price down.


A decline in HBM prices would be unwelcome for Korean companies. Among memory products, HBM is a 'high value-added' item with a high price and profit margin, and it has been the pillar that allowed Korean companies to keep earning profits in the memory market. Even as low-price competition from Chinese companies drove DRAM prices sharply lower, HBM helped Korean firms maintain relatively solid results. If HBM market prices fall, a deterioration in earnings will be hard to avoid.


On the other hand, some forecast that the HBM market as a whole, old and new products alike, will become more active. The shockwave from DeepSeek is expected to loosen NVIDIA's grip on the global AI market, encourage more active AI model development, and raise overall HBM usage. On the 29th, Christophe Fouquet, CEO of Dutch semiconductor equipment maker ASML, said of the AI market changes triggered by DeepSeek while announcing Q4 (October to December) results, "If AI costs decrease, more applications will be needed," adding, "We see this as an opportunity to increase demand for semiconductor chips." He suggested that demand for chips and for the memory that goes into AI systems, including HBM, will strengthen, opening new paths not only for ASML but for many other companies.


Crisis and Opportunity Intertwined, with Close Attention on US Moves

Foreign media and experts have largely refrained from making firm predictions about memory market trends following DeepSeek's emergence. The prevailing view, shared in the Korean industry, is that a more accurate read will only be possible after closely watching the United States' next moves. There are also doubts about DeepSeek's publicly described development process that still need verification: some foreign outlets have suggested DeepSeek may have used the more advanced H100 or H20 chips rather than the H800, and that details of R1's development may have been overstated.

Investigations have also begun into whether illegal or unfair methods were used. OpenAI and Microsoft, developers of ChatGPT, have launched probes into whether DeepSeek improperly collected and misused OpenAI's data. According to Bloomberg and The Wall Street Journal (WSJ), the two companies have found indications that Chinese entities attempted to extract large amounts of data through a technique called 'distillation,' which means using one AI model's outputs as training data to build another model with similar capabilities (a minimal illustration follows this paragraph). The US White House is also reviewing the national security implications of DeepSeek's AI model, and the US House of Representatives is reportedly discussing tighter export controls on technologies underpinning DeepSeek's AI infrastructure.
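For readers unfamiliar with the term, the sketch below shows the basic idea of distillation using toy models and standard training machinery. It is only an illustration of the general technique, not a description of what DeepSeek is alleged to have done.

```python
# Minimal sketch of knowledge distillation: a small "student" model is
# trained to imitate the output distribution of a larger "teacher".
# The models and data here are toy stand-ins for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Sequential(nn.Linear(16, 10))       # much smaller model
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0                                 # softens the target distribution

for step in range(100):
    x = torch.randn(32, 16)                       # toy input batch
    with torch.no_grad():
        teacher_logits = teacher(x)               # teacher outputs become the training signal
    student_logits = student(x)
    # KL divergence between softened teacher and student distributions
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The key point is that the student never sees the teacher's weights or original training data; the teacher's outputs alone are enough to transfer much of its behavior, which is why access to a model's responses is at the center of the allegations.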


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
