Over 10,000 Digital Sex Crime Victims Recorded for the First Time Last Year
Sharp Rise in Deepfake-Related Damage and Video Deletion Requests
Support Center Faces Growing Workload Amid Budget Constraints
"If you don't do as I say, I will create and distribute naked photos of you."
Digital sex crimes, such as threatening victims with the distribution of sexually humiliating videos or creating 'fake videos' with deepfake technology and posting them online, are rising sharply.
Last year, the number of such digital sex crime victims exceeded 10,000 for the first time, the largest figure since the government began counting victims in 2018.
By type of damage, harm caused by synthetic material such as deepfakes stands out in particular, having surged 20-fold in six years and far outpacing the growth in illegal filming of victims' bodies without consent.
According to the Ministry of Gender Equality and Family and the Korea Women's Human Rights Institute on the 11th, the 'Digital Sex Crime Victim Support Center (DiSeong Center)' had supported 10,305 victims as of the end of December 2024, up 14.7% from 8,983 the previous year. This is the first time the number has exceeded 10,000 since the Ministry of Gender Equality and Family opened the DiSeong Center in April 2018 to curb the spread of digital sex crimes, and a nearly eightfold increase in six years from the 1,315 victims recorded in its opening year.
The DiSeong Center provides comprehensive support, including counseling for victims of digital sex crimes, help with deleting harmful videos, and cooperation with investigations. Among these, deleting sexual exploitation material of all kinds, including deepfakes and illegal recordings, is a core task: the center not only deletes reported videos but also proactively removes child and adolescent sexual exploitation material found through monitoring. When illegal videos are distributed, it also provides investigative, legal, and medical support in cooperation with victim support organizations.
The number of counseling cases rose nearly sixfold, from 4,787 in 2018 to 28,173 last year, and the number of harmful videos deleted from adult sites and social networking services (SNS) rose more than tenfold over the same period, from 28,879 to 300,237.
That works out to an average of 822.5 illegal videos deleted every day last year for someone's 'right to be forgotten.' Deletions keep rising year after year because such videos continue to be produced and redistributed even after they are removed. The cumulative total of illegal videos deleted so far exceeds 1.2 million (1,211,797).
When videos of sexual crime victims are distributed, the center files deletion requests with each platform and helps investigative agencies collect evidence. Last year it handled 3,826 cases of investigative cooperation and legal assistance, about 19 times more than six years earlier (203 cases). This jump is attributed to the hotlines operated between the center, prosecutors' offices, and police agencies. In particular, the center deletes videos tied to group cases with five or more victims referred by investigative agencies, and as such crimes have increased recently, cooperation with related agencies has intensified.
The most notable recent trend is the rapid increase in damage caused by synthesized and edited material.
The types of damage most frequently reported by victims (multiple responses allowed) were 'fear of distribution' (25.9%) and 'illegal filming' (24.9%), but the fastest-growing type is 'synthesis/editing.' By how the videos were created, illegal-filming cases increased sixfold in six years, while synthesis/editing cases surged 20-fold (from 69 to 1,384).
Synthesis/editing damage accounted for 8.2% of the total, but with changes in the digital environment, the spread of deepfake technology, and the emergence of generative AI, such damage is expected to grow in increasingly diverse forms.
Under these circumstances, the fastest and most direct way to help victims of digital sex crimes is video deletion. Yet while the number of videos to be deleted grows every year as deepfake technology advances, staffing has not kept pace.
Last year, the Ministry of Gender Equality and Family moved to increase the DiSeong Center's budget to add staff for deleting deepfake sexual exploitation material and announced that the center would operate around the clock. The National Assembly's Gender Equality and Family Committee accordingly approved an increase of 4.7 billion KRW over the government's proposal of 3.2 billion KRW, but the increase failed to pass the National Assembly's Budget and Accounts Committee. Instead, the center must operate this year on 3.2 billion KRW, 200 million KRW less than last year's 3.4 billion KRW.
That left the DiSeong Center's 16 deletion staff each responsible for an average of 18,760 digital sex crime video deletions last year.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
![[Exclusive] Shadows of the AI Era... 'Deepfake' Sexual Crimes Increase 20-Fold in 6 Years](https://cphoto.asiae.co.kr/listimglink/1/2025020716444124468_1738914281.jpg)
![[Exclusive] Shadows of the AI Era... 'Deepfake' Sexual Crimes Increase 20-Fold in 6 Years](https://cphoto.asiae.co.kr/listimglink/1/2025020716444024467_1738914281.jpg)
![[Exclusive] Shadows of the AI Era... 'Deepfake' Sexual Crimes Increase 20-Fold in 6 Years](https://cphoto.asiae.co.kr/listimglink/1/2025021016061826499_1739171178.jpg)
![[Exclusive] Shadows of the AI Era... 'Deepfake' Sexual Crimes Increase 20-Fold in 6 Years](https://cphoto.asiae.co.kr/listimglink/1/2025021016065526501_1739171214.jpg)

