Abuse of 'Deepfake' Technology... Created Using Acquaintances' Photos
Requests for Correction of Sexual Videos Nearly Quadrupled in the Past Two Years
Difficult for Victims to Detect... "Need to Strengthen Cooperation with Foreign Countries"
Two men who produced and distributed illegal pornographic material using the faces of junior students at Seoul National University have been arrested by police. They synthesized photos of acquaintances posted on social networking services (SNS) to create illegal pornographic content that made it appear as if the acquaintances themselves were in the videos. The material they produced was shared with more than 1,000 people.
The crime involved the use of Telegram and is referred to as the 'Seoul National University Nth Room.' What distinguishes this case from previous Nth Room incidents is that it is a new type of digital sex crime that uses deepfake technology. 'Deepfake' is a portmanteau of 'deep learning' and 'fake,' referring to AI-based human image synthesis. The technology is used not only in sex crimes that composite acquaintances' photos into pornographic material, but also as a tool to fabricate the speech and actions of politicians and other celebrities and to spread false information.
Reports by Victims or Acquaintances Remain in the 10% Range
Deepfake crimes have only recently begun to attract social attention, and the laws punishing them are less than five years old. They were enacted in 2020, when deepfake videos surged as a new type of digital crime. Under Article 14-2 of the Act on Special Cases Concerning the Punishment, etc. of Sexual Crimes (Distribution of False Videos, etc.), anyone who, for the purpose of distribution, produces pornographic material that may cause sexual shame against the will of the person depicted is subject to imprisonment for up to five years or a fine of up to 50 million won.
Nevertheless, deepfake sexual videos remain difficult to report and punish. Because any photo uploaded to SNS or portal sites can be used in a crime, victims often do not even realize they have been targeted. In the recently revealed Seoul National University case, photos posted on SNS and messenger profile pictures were used in the crime without the victims' knowledge.
The production methods are also becoming simpler. Anyone can have a video or photo made by paying a fee of about 50,000 won, and mobile applications (apps) that quickly create deepfake images have recently emerged. Shin Jin-hee, a lawyer specializing in sex crimes, said, "As technology advances, the methods of producing deepfake sexual videos are becoming easier and more diverse," adding, "They are easily accessible to anyone, making them highly likely to be misused for crimes."
Despite the surge in deepfake crimes, it is rare for victims or their acquaintances to discover the videos themselves. According to data that Rep. Go Min-jung of the Democratic Party received from the Korea Communications Standards Commission (KCSC), the number of 'requests for correction of sexual videos' rose nearly fourfold, from 1,913 in 2021 to 7,187 in 2023. Over the same period, however, only 1,874 cases, or 14.7% of the total, came in through complaints; 10,693 cases, or 84.3%, were uncovered by the KCSC's own monitoring.
A KCSC official said, "Many victims do not even realize that their face or body has been made into a deepfake video circulating online," adding, "Although we monitor 24/7 year-round, it is difficult to find and punish all pornographic materials."
Only 2% Deleted... No Authority Over Overseas Servers
Even when illegal pornographic material is discovered and the victim becomes aware of the harm, the videos cannot simply be deleted. Measures against online content are governed by the 'territoriality principle,' which limits the application of a country's laws to its own territory, so domestic institutions including the KCSC have no authority to demand the deletion of videos distributed on sites whose servers are located overseas. Instead, Korea relies on 'access blocking,' which prevents domestic users from reaching such videos. But because the block can be bypassed by changing one's IP address and the videos themselves are never deleted, secondary harm can still occur, a clear limitation of the approach.
The KCSC has an 'International Cooperation Team' under its jurisdiction that actively requests deletion measures in cooperation with overseas related agencies. Nevertheless, the deletion rate is very low. Reviewing the correction results for false sexual videos over the past four years (June 25, 2020 to April 30, 2024), the deletion rate was only 2.3%.
Experts advise that as global awareness of illegal pornographic material such as 'deepfake porn' grows, a shared consensus on the issue and continued intergovernmental cooperation are needed. Choi Kyung-jin, a professor of law at Gachon University and president of the Personal Information Experts Association, said, "The most practical approach is to build awareness and consensus on deepfake porn through intergovernmental cooperation and encourage the creation of related laws," adding, "Even the United States and Europe, which emphasize freedom of expression, have recently recognized the need to regulate illegal videos. We should make the most of this trend."
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.



