UK Broadcaster Analyzes Deepfake Porn Websites
About 4,000 Global Celebrities Victimized by Pornographic Content
As artificial intelligence (AI) technology rapidly advances, harm from deepfake pornography (sexually explicit videos and images synthesized with AI) is also growing. An analysis found that about 4,000 celebrities worldwide, whose faces are far more publicly exposed than ordinary people's, have been victimized by pornographic deepfakes, fueling calls for strict punishment of deepfake pornography.
On the 21st (local time), the UK's Channel 4 News reported that its analysis of five deepfake websites found deepfake pornography targeting about 4,000 celebrities, including 250 British individuals. The sites recorded a combined 100 million views over three months, and most victims were entertainers, musicians, and YouTubers active in the entertainment industry.
On January 31, the UK amended the Online Safety Act to make sharing non-consensual deepfake pornography illegal. However, the government stopped short of criminalizing its production. Whereas only a single deepfake pornography site was online in 2016, numerous sites have appeared since January of last year; there are now 40 such sites hosting 143,733 deepfake pornographic videos.
A Google spokesperson stated that deepfake pornography causes significant distress and that the company is developing additional search safeguards to prevent deepfake harm. A spokesperson for Ofcom, the UK's communications regulator, said the Online Safety Act should be strengthened so that harmful content can be quickly removed and its distribution blocked.
The problem of deepfake pornography is not limited to overseas. On the 20th, allegations of a sex crime involving deepfake technology surfaced at a middle school in Chungbuk, prompting a police investigation. Several male students allegedly superimposed acquaintances' faces onto pornographic videos and circulated them; the victims included not only adult women but also minors, compounding the severity of the case.
In South Korea, under the Act on Special Cases Concerning the Punishment of Sexual Crimes, revised in 2020, both the production and distribution of deepfake videos are punishable by up to five years in prison or a fine of up to 50 million won. Viewing such videos can also be punished under the Act on the Protection of Children and Youth Against Sex Offenses and other related laws.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.