Interview with Ha Jung-woo, Head of Naver Cloud AI Innovation Center
Naver is participating as a technical expert in the United Nations (UN) high-level advisory body on artificial intelligence (AI). The UN established this advisory organization in the second half of last year to promote the creation of an international AI organization, raising the possibility that the voices of domestic tech companies will be reflected in future global AI policies.
On the 27th, Ha Jung-woo, head of Naver Cloud AI Innovation Center, told Asia Economy, "Naver is participating as a technical expert in the report being prepared by the UN’s AI Advisory Body."
The UN AI Advisory Body, launched in October last year, is composed of 39 experts, government officials, and scholars worldwide to address international governance issues related to AI. In South Korea, Ko Hak-soo, chairman of the Personal Information Protection Commission, was selected and is serving as an advisory member.
This body will gather opinions from stakeholders in various fields such as education, environment, and health over one year, conduct in-depth analysis, and publish a report containing recommendations on AI governance construction and the establishment of an international organization. It is known that many experts, including domestic companies like Kakao and university professors, are participating in the report preparation alongside Naver.
Since experts and stakeholders are located in various countries, meetings are held via video conference. At Naver, Ha Jung-woo directly participates. Ha joked, "The meetings are held at dawn, which is tough," and added, "I am providing practical advice on how the report should be directed from a technical perspective." He emphasized, "To ensure AI safety, various entities such as industry, government, academia, and international organizations must cooperate."
Naver’s participation in the advisory body is part of its concerns about AI safety. Ha said, "Even before the release of the large language model HyperCLOVA, we thought it was an extremely powerful double-edged sword that could be used very dangerously if mishandled," and added, "We established a separate red team for research and development to prevent dangerous use."
The AI red team is an upgraded version of the ‘red team’ that discovers and attacks vulnerabilities in organizations or systems, playing a role in improving the performance and reliability of AI systems. It was first introduced by Microsoft in 2018 and is now operated by various big tech companies such as Google and OpenAI.
Ha said, "The establishment of the in-house ‘Future AI Center’ in January is in the same context," adding, "As AI responsibility and safety issues are emerging globally, the company judged that it was necessary to systematically establish AI safety governance at the corporate level." The Future AI Center is a CEO-direct organization created to research AI safety and develop responsible AI. Ha currently serves as the head of the Future AI Center as well.
Naver is also participating in the industry-academia consortium MLCommons, which develops machine learning technology for public interest purposes. This consortium includes global companies such as Google, Microsoft, Samsung Electronics, Intel, Qualcomm, and prestigious universities like Stanford and Harvard.
Ha said, "There is an AI Safety working group within MLCommons, and we accepted their request for Naver to participate," adding, "We are currently creating benchmarks to evaluate AI-CFT (Cross Functional Team) together with various global companies."
Ha also stressed the methodology on how to define sensitive social issues and construct datasets in AI development. He said, "In South Korea, age, region, and feminism are very sensitive issues, while in the U.S., ‘race’ is sensitive. Since cultural and national values prioritize differently, AI learning may vary," and added, "For AI to become universal, it is important to form a consensus among countries, companies, and individuals that can encompass all these issues."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.


