"Stop Student Nude Photos"... Schools Worldwide Battle Deepfake Pornography

Increase in Pornography Production Using Classmates' SNS Photos
From Expulsion to Police Arrests in the US
Victims Include Not Only Students but Also Teachers in Korea
Need to Revise School Rules and Related Laws Growing

Schools and communities around the world have launched a 'war against deepfake pornography.' Digital sex crimes involving the creation and distribution of fake nude photos made with artificial intelligence (AI) are spreading as AI tools become more widely available. With the number of student victims rising, calls for preventive measures are growing more urgent.

"Stop Student Nude Photos"... Schools Worldwide Battle Deepfake Pornography [Image source=Reuters Yonhap News]
Five Male Students Expelled in Beverly Hills for "Severe Bullying"

According to the New York Times (NYT) and others on the 13th, in February this year, Beverly Vista Middle School in Beverly Hills, an affluent area of Los Angeles, USA, decided to expel five male students who had created and distributed fake nude photos of 16 female students at the school. However, despite an investigation by the local police, no charges or arrests were brought against the perpetrators.


'Deepfake' is a portmanteau of 'deep learning' and 'fake.' It refers to images or videos manipulated with AI so that faces and other features appear real. Recently, as generative AI applications have become easier to use, students' production and distribution of fake nude photos and other fabricated pornography using these tools has surged.


Upon learning that male students had created and distributed deepfake pornography of 12- to 13-year-old female students, Beverly Vista Middle School immediately sent a message titled "The Terrible Misuse of AI" to all students, parents, and staff, urging measures to keep students from misusing AI. Michael Bregy, the district superintendent, said, "This is severe bullying occurring at school, and such explicit photos are shocking and violent to the victims and their families." He added, "While students are still learning and growing and may make mistakes, taking responsibility is natural, and this crime is absolutely unacceptable."

Fake Nude Photos Distributed in Florida Lead to Police Arrests

In the United States, even when deepfake pornography is created and distributed at schools and police investigate, the offending students are rarely arrested.



According to the US tech outlet Wired, two male students aged 13 and 14 in Miami, Florida, were arrested last December for creating and distributing deepfake pornography of 12- to 13-year-old female students. The case is regarded as the first in which perpetrators of deepfake pornography creation and distribution were arrested and criminally charged. According to the police investigation report obtained by the outlet, the two were charged with creating and distributing fake pornography without the victims' consent, a third-degree felony under a Florida law enacted in 2022.


Because deepfake pornography in schools is a new type of incident built on cutting-edge technology, many schools have struggled to decide quickly how to respond. In October last year, a male student at Westfield Public High School created and distributed deepfake pornography using a female student's social media photos, and the victim and her parents demanded strong action from the school. Owing to the school's delayed response, however, the victim's parents were not even provided with a proper official incident report.


Riana Pfefferkorn, a researcher at the Stanford Internet Observatory, pointed out, "This phenomenon emerged so suddenly that many schools, unprepared, do not know what to do when such incidents occur."


As such incidents multiply across the United States, the Federal Bureau of Investigation (FBI) issued a warning last month that creating and distributing identifiable deepfake pornography of minors using generative AI is illegal.

New Type of School Violence Using AI... Also a Headache in Korea

Incidents of creating and distributing deepfake pornography within schools are not limited to the United States.


At the end of last year, CBC News reported that deepfake pornography sexually depicting female students at a school in Winnipeg, Canada, had been distributed online, prompting the school to report the matter to the police. The fake nude photos were created by feeding publicly available social media photos into generative AI apps. The school investigated to identify the distributor and contacted companies to have the fake photos deleted and prevent further spread.


The situation is similar in Europe. In September last year, in the town of Almendralejo in southern Spain, more than 20 female students were shocked to receive fake nude photos bearing their faces via messenger apps. The images had been created by feeding the victims' social media photos into generative AI apps that converted them into nude pornography. Among the victims was an 11-year-old girl. Some victims reported psychological trauma so severe that they found it difficult even to leave their homes.


The victims' parents formed support groups and filed complaints with the police, who are still investigating. The roughly ten identified suspects, mostly minors aged 12 to 14, had distributed the fake pornographic photos through online messengers such as WhatsApp and Telegram.


In Korea, crimes involving the creation and distribution of deepfake pornography within schools are also occurring. The education sector views this as a so-called 'new type of school violence' using AI.


Recently, a broadcast reported a case in Busan in which a father discovered that a pornographic image combining the face of his daughter, a third-year middle school student, with another woman's nude body had been distributed; the perpetrator turned out to be a male student at the same school. In Jincheon, Chungbuk Province, five male third-year middle school students were investigated by police for creating and distributing fake nude photos using images of five female classmates and two female teachers from their school.


In Korea, deepfake pornography incidents are prosecuted under the Act on Special Cases Concerning the Punishment of Sexual Crimes. Under this law, editing, synthesizing, or processing images or videos of a person's face, body, or voice in a way that may cause sexual desire or shame is punishable by up to five years in prison or a fine of up to 50 million won.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
