[Hate Society] Emergency Response Order Issued at Dawn on the 30th... The Hectic 12 Hours of Naver and Kakao

Harmful-Content Management Teams on Emergency Footing: Block First, Appeal Later
Reported Posts Blocked from Exposure... Service Restrictions Imposed on Authors

Editor's Note: With 307 casualties, including 155 deaths, the entire nation was deeply shocked by the Itaewon disaster. Amid this, the long-standing problem of hate is stirring once again, in diverse forms and types. As the government announced support measures for the victims, reactions ranged from "Why should my taxes support people who went out to dance and have fun and then got into an accident?" to baseless suspicions such as "Groups of Chinese people pushed others," fueling racial and gender divisions. With hateful posts, there is no clear line between perpetrator and victim: perpetrators become the victims of other hate-spreaders, and the hate spreads like an epidemic in an instant. Ultimately, the only way to stop it is our society's own effort at self-purification. Looking back on the hectic two days following the disaster, we examined how to move away from a society of hate and where we should head.


"A crush accident occurred in Itaewon during a Halloween party. Please strengthen monitoring as sensational videos, photos, malicious rumors, and hateful remarks are expected."


Emergency Response Order Issued at Dawn on the 30th

[Asia Economy Reporter Choi Yuri] Around midnight on the 29th, as news of the Itaewon crush disaster began to spread, an emergency response order went out to the Naver User Protection Team and the Kakao Monitoring Task Force, which are responsible for managing harmful posts. Both teams review and act on reported posts 24 hours a day, year-round, but in cases like this, where the social impact is large and secondary damage is a concern, they switch to an emergency footing.


As the tense situation continued into the early hours of the 30th, unblurred, unpixelated photos and videos from the scene, along with unverified claims about the accident, spread rapidly online, keeping the monitoring staff even busier. The images AI can filter out on its own are mostly those that are extremely sensational or close to pornography. Normally many users would have reported such posts, but because it was early morning the reports were few, so staff had to find the videos one by one, block them, and blur them so users could not see them.


It was also impossible to completely block the spread of unverified claims about the accident. Broadcasters kept airing unfiltered phone interviews with citizens at the scene. Users unfamiliar with the situation on the ground poured harsh criticism on the victims, rebuttals followed, and the exchanges escalated into hateful comments that inflamed generational and gender conflicts, pushing the situation beyond control.



Emergency Notice on the Itaewon Incident at Noon on the 30th, Intensive Blocking of Harmful Posts

At noon on the 30th, 12 hours after the emergency response order for the Itaewon incident was issued, Naver and Kakao posted urgent notices about related posts, asking users to refrain as much as possible from uploading photos and videos of the disaster site and to stop spreading false information. They then worked through the flood of reports, mobilizing related departments such as the search, planning, and business teams to find and block harmful posts. Sensational photos and videos gradually disappeared, and hateful comments were hidden.


At Naver, a dedicated monitoring team and an emergency reporting center handle the response to harmful posts. To minimize the damage such posts cause, a reported post is blocked first and the author is allowed to appeal afterward. For obscene material, an AI technology called 'X-eye' detects images in real time to prevent exposure. Since the Itaewon disaster, Naver has stepped up post monitoring, and since the 30th it has posted notices on cafes, blogs, and other communities guiding users to report content that violates its terms.
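
The flow described above (block a reported post first, take the author's appeal afterward, and pre-screen obscene images in real time) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not Naver's actual implementation: the class and method names (ReportFirstModeration, handle_report, and so on) and the is_obscene classifier are hypothetical stand-ins.

```python
from dataclasses import dataclass
from enum import Enum


class PostStatus(Enum):
    VISIBLE = "visible"
    BLOCKED = "blocked"      # exposure blocked immediately on report
    RESTORED = "restored"    # appeal accepted, visible again


@dataclass
class Post:
    post_id: str
    author_id: str
    status: PostStatus = PostStatus.VISIBLE


class ReportFirstModeration:
    """Illustrative 'block first, appeal later' handling of reported posts."""

    def __init__(self, is_obscene):
        # is_obscene: callable(bytes) -> bool, a stand-in for a real-time
        # image classifier in the role the article attributes to X-eye.
        self.is_obscene = is_obscene

    def handle_report(self, post: Post) -> None:
        # On report, exposure is blocked right away to limit further spread.
        post.status = PostStatus.BLOCKED

    def handle_appeal(self, post: Post, appeal_accepted: bool) -> None:
        # The author may object afterward; exposure is restored only if the
        # appeal is accepted.
        if appeal_accepted and post.status is PostStatus.BLOCKED:
            post.status = PostStatus.RESTORED

    def screen_image(self, post: Post, image_bytes: bytes) -> None:
        # Real-time pre-screening: block clearly obscene images before exposure.
        if self.is_obscene(image_bytes):
            post.status = PostStatus.BLOCKED
```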

On the 1st, during the national mourning period following the large-scale crush disaster in Itaewon, police control access to the disaster site in Itaewon, Yongsan-gu, Seoul. Photo by Kang Jin-hyung aymsdream@

Kakao Also Blocks Exposure in 'Open Chat'... User IDs Suspended Upon Accumulated Reports

Kakao responds mainly through its User Protection Team. Normally it acts on reports received by phone or online, but when a major issue like this arises, it switches to active monitoring. Reported posts are blocked from exposure. Beyond individual posts and comments, KakaoTalk Open Chat rooms that indiscriminately share problematic content are excluded from search results and subjected to further measures. Comments that violate operating policies are analyzed by an AI technology called 'SafeBot' and blinded (hidden), even without a report.
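
The comment and Open Chat handling described above can be sketched as a few small routines. This is a rough sketch under assumptions: the violates_policy classifier, the field names, and the three-strike threshold are illustrative, not Kakao's actual SafeBot internals or operating criteria.

```python
from typing import Callable


def handle_report(post: dict) -> dict:
    """Block exposure of a reported post pending review."""
    post["exposed"] = False
    return post


def moderate_comment(text: str, violates_policy: Callable[[str], bool]) -> dict:
    """Blind a comment when the classifier judges it to violate operating policy."""
    # violates_policy stands in for an automated classifier in the role the
    # article attributes to SafeBot; any model with this signature would do.
    return {"text": text, "blinded": violates_policy(text)}


def review_open_chat(flagged_share_count: int, threshold: int = 3) -> dict:
    """Drop an Open Chat from search once it repeatedly shares flagged content."""
    searchable = flagged_share_count < threshold
    return {"searchable": searchable, "under_review": not searchable}
```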


When reports accumulate, measures are also taken against the users who posted the content. On KakaoTalk, temporary or permanent restrictions can be imposed, ranging from limiting search exposure and visibility in friend lists to restricting message sending, chat-room use, and overall service use. On Daum, user IDs can be temporarily or permanently suspended.
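
The escalation described here (accumulated reports leading to progressively stronger restrictions on the author, up to suspension) can be modeled as a simple threshold mapping. The report counts and durations below are assumptions chosen for illustration; the article does not disclose Kakao's actual cut-offs.

```python
from datetime import datetime, timedelta, timezone


def apply_sanction(report_count: int) -> dict:
    """Map an accumulated report count to an escalating restriction."""
    # Thresholds and durations are hypothetical, not Kakao's real criteria.
    now = datetime.now(timezone.utc)
    if report_count >= 10:
        # Strongest measure: permanent restriction of overall service use,
        # analogous to a permanent Daum ID suspension.
        return {"level": "permanent_suspension", "until": None}
    if report_count >= 5:
        # Temporary restriction on message sending and chat-room usage.
        return {"level": "temporary_restriction", "until": now + timedelta(days=7)}
    if report_count >= 2:
        # Softer measure: limited search exposure and friend-list visibility.
        return {"level": "limited_exposure", "until": now + timedelta(days=1)}
    return {"level": "none", "until": None}
```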


A Kakao official said, "The User Protection Team and related departments have been on emergency response since the weekend," adding, "While strengthening monitoring, we are collaborating with external organizations to create content regulation guidelines and user protection policies."


Major social media platforms such as Facebook and Twitter have also taken action. Twitter uses machine learning to identify and remove content that violates its rules and operates a team that enforces the Twitter Rules and reviews reported malicious posts. Facebook likewise takes action through its own community guidelines and abuse-report channels.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
