'AI Core Chip' Nvidia A100 Import Blocked
Efforts to Achieve Similar Performance by Combining Older Models
According to a report by The Wall Street Journal (WSJ) on the 7th (local time), China, struggling to secure advanced semiconductors due to sanctions imposed by the Joe Biden administration, is accelerating efforts to develop cutting-edge artificial intelligence (AI) by utilizing older semiconductors.
WSJ reported, based on analysis of various research papers and interviews with multiple sources, that major Chinese tech companies such as Huawei, Baidu, and Alibaba Group are focusing on developing such technologies.
This research activity by Chinese tech companies stems from the blocked export of Nvidia’s A100 and H100, the American chips most widely used in AI. The A100, released by Nvidia in 2020, was designed for large-scale AI training and inference and is essential for training the large language models (LLMs) behind chatbots like ChatGPT. Its successor, the H100, was announced in March of last year.
Although OpenAI, the developer of ChatGPT, has remained silent, the global investment bank UBS once estimated that training large AI models like ChatGPT requires approximately 5,000 to 10,000 A100 units.
Due to semiconductor export regulations imposed by the Biden administration in October last year, China is currently unable to import A100 and H100 chips. Consequently, Nvidia supplies the Chinese market with the A800 and H800, downgraded variants with reduced chip-to-chip data-transfer speeds designed to comply with the export rules. In response, Chinese firms are limiting the use of previously purchased A100 and H100 chips while researching ways to achieve similar performance from multiple A800 and H800 units.
According to a survey by the China Semiconductor Industry Association disclosed at a confidential conference, it is estimated that there are currently around 40,000 to 50,000 A100 chips within China. Chinese tech companies such as Alibaba and Baidu are reportedly taking internal measures to minimize the use of their stockpiled A100 chips. WSJ previously reported that Baidu instructed other teams, including its autonomous driving unit, to stop using A100 chips to focus on developing its AI chatbot, 'Ernie Bot.'
Instead, Chinese companies are exploring ways to utilize semiconductors that can still be imported under these circumstances. Professor Yang Yu of the National University of Singapore, who runs the AI infrastructure company HPC-AI Tech, stated that Chinese companies are researching how to combine three or four less advanced semiconductors, including the A800 and H800, to achieve performance comparable to Nvidia’s cutting-edge chips. A representative example is the computing cluster Tencent unveiled last April, which uses Nvidia’s H800 to train large-scale AI models.
Professor Yang evaluated that this Chinese approach could be costly. If a U.S. company requires 100 H100 units to train a large language model, a Chinese company would need 3,000 or more H800 units to achieve the same result. WSJ reported that Alibaba, Baidu, Huawei, and others are researching technologies by combining older Nvidia semiconductors such as the V100 and P100 along with the A800 and H800.
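The trade-off described above, swapping a few fast chips for many slower ones at a worse-than-linear rate because communication overhead grows with cluster size, can be sketched with a toy back-of-envelope model. All numbers below are hypothetical illustrations, not figures from the article or from Nvidia specifications:

```python
def chips_needed(target, per_chip, efficiency, cap=100_000):
    """Smallest cluster size n whose effective throughput
    n * per_chip * efficiency(n) meets the target, or None
    if no cluster up to `cap` chips can reach it."""
    for n in range(1, cap + 1):
        if n * per_chip * efficiency(n) >= target:
            return n
    return None  # target unreachable under this efficiency curve

# Hypothetical baseline: 100 fast chips at near-ideal scaling,
# each delivering 1.0 unit of training throughput.
target = 100 * 1.0

# Hypothetical slower chip: capped interconnect leaves 0.5 units
# per chip, and parallel efficiency decays as the cluster grows.
decay = lambda n: 0.95 ** (n / 100)

print(chips_needed(target, per_chip=0.5, efficiency=decay))
```

With perfect scaling (`efficiency = lambda n: 1.0`), matching 100 fast chips would take exactly 200 half-speed chips; any efficiency loss pushes the count higher, which is the dynamic behind needing far more H800s than H100s for the same job.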
In general, stitching many chips together with software rarely yields stable operation, and in the U.S. the method is regarded as a 'last resort.' Chinese tech companies, however, facing numerous restrictions, appear to be actively embracing such combinations to work around them.
WSJ commented, "Although there are still challenges to overcome to catch up with U.S. AI leaders, some experiments have confirmed the potential," adding, "If successful, Chinese tech companies could not only resolve issues caused by U.S. sanctions but also respond more flexibly to future restrictions."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.