
Jumping Out of Bed to "Please Delete It" Requests... 820 Explicit Videos Deleted Today

“A Night-and-Day Battle Against Digital Sexual Crimes”
Inside the Digital Sexual Crime Victim Support Center
“Jumping Out of Bed at Dawn for Emergency Deletion Requests”

"A sexually exploitative video has been found on Twitter!"


Sung Hye Park, head of the deletion team at the Digital Sexual Crime Victim Support Center (the "DiSeong Center"), jumps out of bed and turns on her computer whenever an emergency deletion request comes in, even at 2 a.m. She knows all too well how anxious victims of digital sexual crimes feel with every passing second. Over the past year, Park has deleted 18,760 illegal sexually exploitative videos this way.


Officials at the Digital Sexual Crime Victim Support Center of the Korea Women's Human Rights Institute delete illegal videos. Photo by Jo Yongjun

I visited the DiSeong Center, where Park works, on February 5. The center, which has helped 10,000 victims of digital sexual crimes, looks on the surface like any ordinary office. But as soon as the interview began, a shout of "Please stop deleting for a moment!" rang out and all work came to a halt: the staff were deleting videos that must not be seen by anyone outside the deletion team.


Here, they delete all kinds of sexual crime videos, including those illegally filmed with miniature cameras in restrooms or changing rooms, videos recorded with a partner that were distributed without the victim's knowledge, and videos where the victim's face was synthesized or edited into sexually offensive content.


This is where victims, desperate for help, plead for their videos to be erased, and where a silent war is waged against those who spread the videos mockingly, as if daring, "Go ahead, try to delete them." This is the DiSeong Center.



Following incidents such as the "Nth Room" case in 2019 and the "Vigilante Group" case in 2024, the DiSeong Center's role has become even more crucial. Six years have passed since the government, vowing to eradicate illegal videos, established the center under the Korea Women's Human Rights Institute, an agency affiliated with the Ministry of Gender Equality and Family. Yet illegal videos that need deleting continue to proliferate like poisonous mushrooms.


Until recently, however, only 16 people were assigned to deletion duties. With two additional hires this year, the team now numbers 18, including Park.


Every day, they monitor various illegal sites and platforms such as Telegram, deleting an average of 820 sexually explicit crime videos per day.


Although they work to protect someone's "right to be forgotten," is it truly possible to find and delete all the victim videos that spread like poisonous mushrooms every day?


Park said, "It would be nice if support were strengthened, but everyone here works with a strong sense of duty."



First, they focus on monitoring roughly 300 adult sites for videos resembling victim footage. For this they use DNA search technology, which matches a unique value computed for each video. Victim videos often circulate not only in their original form but also as edited or synthesized versions, and this method catches even those altered copies.


Everything else is done by hand. Even with technological assistance, the final step is always a person visually confirming the content before requesting deletion from the hosting provider. Each hosting company applies its own standards, so requests are not always processed immediately. Some providers refuse because key body parts are not visible, or because there is no clear evidence identifying a specific person, which often makes deletion difficult.


For example, even when perpetrators and regular viewers can identify the victim from a video's title alone, hosting companies may refuse deletion, claiming that the keywords do not specifically identify the victim. In other words, if a provider cannot confirm whether a video is ordinary pornography or illegal sexually exploitative content, it will not delete it even on request.


Sung Hye Park, head of the deletion team at the Digital Sexual Crime Victim Support Center of the Korea Women's Human Rights Institute, during her interview with The Asia Business Daily. Photo by Jo Yongjun

Sometimes, they approach their work by thinking, "If I were the perpetrator," imagining which sites and under what titles the videos might have been uploaded, and conducting keyword searches accordingly. As they gain experience, they learn which sites tend to distribute certain types of videos more frequently.


International cooperation is also frequently required. Each year, about 25% of videos cannot be deleted, mostly because the servers hosting them are located overseas. Working through international channels lets the team delete dozens of videos at once rather than one at a time, which is why, despite this year's tight budget, two staff members were secured to handle international cooperation.


Last year, as part of its "Measures to Strengthen Response to Deepfake Sexual Crimes," the Ministry of Gender Equality and Family planned to increase the budget, expand the deletion team to 33 people, and switch the center from its 8 a.m.-to-10 p.m. schedule to 24-hour operation. The plan was scrapped, however, after failing to clear the final hurdle in the National Assembly's Special Committee on Budget and Accounts. For now, when emergency deletion requests come in outside the deletion team's working hours, the counseling staff, who work in three shifts, take the reports, locate the videos, and forward them to the deletion team.



Park said, "Although the DiSeong Center is limited to providing post-incident support, there are ways to prevent further distribution," and asked that the number '1366' be included at the end of the article. 1366 is a hotline number that the government introduced at the end of last year as part of its response measures, serving as a unified counseling channel for digital sexual crimes.


She explained, "Most people are afraid and feel burdened because they think contacting us means 'reporting' a digital sexual crime," adding, "If you suspect you are a victim of deepfake content or are being threatened with video distribution, please contact us immediately." Even if you are merely uneasy about an intimate video you sent someone in a chat, or hesitate because the video has not yet been distributed, she urges you to seek support. If a video matching your account is found during monitoring, the DiSeong Center can delete it immediately to prevent further spread.


"Once a video starts spreading, there's nothing you can do. Let's focus on deletion first."


The DiSeong Center currently has 18 deletion staff and 15 counseling staff. To respond around the clock to the growing number of digital sexual crime cases, an estimated 60 staff members in total would be needed.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

