AI Basic Act to Take Effect in 2 Days: "Ambiguous Scope of High-Impact AI Fails to Reflect Reality"

Industry Warns of "Double Regulation and Startup Burden"
Government Emphasizes "Promotion Over Regulation... More Than One Year Guidance Period"

The Artificial Intelligence (AI) Basic Act, the first law of its kind in the world, takes effect on January 22. While the industry agrees with its intent, concerns are being raised that it could hinder the development of the AI ecosystem.


The official name of the AI Basic Act is the "Framework Act on the Promotion and Trust-Based Development of Artificial Intelligence." Passed by the National Assembly plenary session in December 2024, the law lays the groundwork for a government-backed governance system that supports AI development and advancement while ensuring ethics and trustworthiness. At the same time, it imposes transparency and safety obligations on businesses.


In regulating high-impact AI, the law is similar in nature to the European Union's "AI Act." However, the AI Act subdivides AI risks into four levels (unacceptable or prohibited risk, high risk, limited risk, and minimal risk) and explicitly bans certain types of AI considered clear threats to fundamental rights, providing more detailed guidance than the AI Basic Act, which defines the scope of high-impact AI more ambiguously. The EU also enacted its legislation first, in 2024, and after coordinating with industry postponed full implementation until the end of 2027.


Within the industry, there are concerns that the AI Basic Act will hinder the growth of AI in fields closely tied to daily life, such as healthcare, energy, and transportation, which are classified as high-impact AI. Imposing regulations at a stage when companies are still striving to advance their AI technology could simply add to their burden. Startups in particular point out that they lack the personnel and resources to respond to regulatory changes. According to a survey of 101 domestic AI startups conducted by Startup Alliance at the end of last year, 98% responded that they had not established a practical system for responding to the AI Basic Act.


One industry insider stated, "In the case of medical AI, it is already subject to the Medical Devices Act and the Digital Medical Products Act, so the AI Basic Act amounts to double regulation. Startups, many of which focus solely on AI, will feel the burden even more."


The AI Basic Act also regulates generative AI outputs, aiming to prevent the spread of deepfakes and misinformation. This has become a major concern for the content industry, where user resistance to AI utilization remains high. When producing games or webtoons, AI is often used as an auxiliary tool, but it is difficult to specify the extent of its use, leading to situations where watermarks must be added to a large number of images.


An official from the content industry commented, "Since the value of creativity is so important in content, there is significant resistance to AI-generated works. If it is not possible to clearly distinguish, for example, 'the text was written by a human and the images were created by AI,' then it will be difficult to specify the degree of AI involvement, which could lead to confusion and controversy."


Visitors view works from the AI artist community at the 'AI Content Festival' held at The Platz, COEX, Gangnam-gu, Seoul. Photo by Yonhap News.


Amid various concerns, companies are busy preparing for the AI Basic Act, which is now just two days away. Kakao will implement revised terms and conditions starting February 4, requiring notification and labeling in accordance with the relevant law when providing AI-generated results, including personalized content recommendations and advertising services operated by AI. Naver is also reportedly preparing its own internal guidelines.


For the time being, the government plans to apply only the minimum necessary regulations, and to do so in a rational and flexible manner. Violations of the AI Basic Act can result in fines of up to 30 million won, but to minimize market confusion in the early stages of implementation, a guidance period of at least one year has been set. An official from the Ministry of Science and ICT said, "Although attention has focused on its regulatory aspects, the Basic Act is significant in that it provides a legal basis for supporting AI research and development and fostering professional talent for industrial advancement. For startups, we plan to operate an integrated information and support center to provide consulting."


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.

