[In-Depth Look] 'Iruda' and Artificial Intelligence Ethics

Ham Hyeri / Journalist · Cultural Critic

Recently, while upgrading my internet service, I 'belatedly' brought home an artificial intelligence (AI) speaker that connects to the television. It is no match for 'AlphaGo', Boston Dynamics' humanoid robot 'Atlas', or its quadruped robot dog 'Spot', yet the world AI opens up is simply fascinating. Artificial intelligence is a core technology of the Fourth Industrial Revolution and is developing at a rapid pace. As with every technological advance, however, AI brings many side effects along with its conveniences, and ethical issues are among the most serious problems we must address in the AI era.


In this regard, the recent controversy surrounding the AI chatbot 'Iruda' carries significant implications for the era of the Fourth Industrial Revolution. 'Iruda', officially launched by Scatter Lab on December 22 last year, was given the persona of a 20-year-old female university student who likes BLACKPINK and enjoys recording small moments of daily life in photos and writing. Backed by active marketing, it drew attention from teens and twenty-somethings who spend much of their time alone and online amid COVID-19, gathering nearly 750,000 users and a positive response within roughly two weeks.


Controversy arose, however, when it became known that some users in online communities were treating 'Iruda' as a sexual object and training it accordingly. They were abusing the fact that the software is not a rule-based system operating on predetermined rules but a deep learning-based chatbot that learns from its conversations with users. Iruda's own inappropriate remarks, which included hate speech and discriminatory statements, also became a problem.


Similar cases have occurred before. Microsoft's AI chatbot Tay, launched in March 2016, caused a stir after users from anonymous sites with white-supremacist and anti-Muslim leanings fed it profanity and racist, sexist remarks, which it learned and repeated, including statements denying the Holocaust. Microsoft suspended Tay just 16 hours after its launch.


The controversy does not end there. Claims have been raised that Scatter Lab illegally used and leaked the personal information it collected, and users alleging data breaches are preparing a class-action lawsuit. 'Iruda' stood out for its natural writing style and context-aware conversation, a result of training on some 10 billion KakaoTalk messages exchanged between real couples.


The results of deep learning-based AI vary greatly depending on what data it is built from and which users it interacts with. The principle of "you reap what you sow" applies to AI deep learning as well. Because AI learns from data provided by humans but cannot filter out bias on its own, special attention must be paid to data bias. User education is also needed to prevent abuse of machine-learning features, and it must be preceded by an understanding of the value and proper use of personal information and copyrighted material, which make up the bulk of that data.


At this year's International Consumer Electronics Show (CES), AI ethics was also a hot topic. Brad Smith, Microsoft's Chief Legal Officer (CLO), stressed that "AI seems to promise everything, but we need to establish new guardrails," adding that "humanity must be able to control technology used as a weapon." The global trend, amid this civilizational transition, is to weigh the social impact of technology.


Artificial intelligence is advancing at an uncontrollable pace and permeating every corner of our lives. Whether the AI world brings a brilliant future or a fearful one depends on us. Before it is too late, developers and users alike must establish and strictly adhere to 'AI ethical standards.' We earnestly hope that the growing pains we are experiencing now will prove a prelude to the emergence of warmer, more humane AI.




© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
