The Shadow of the Digital Age: The Terror of 'Deepfake' [Yojum Saram]

An image of another person's face superimposed on BTS interview footage.
[Photo: YouTube screen capture]

[Asia Economy Reporter Kim Jong-hwa] 'Deepfake' technology has recently drawn attention because of the Nth Room incident. As science and technology have advanced, it has become possible to seamlessly replace the face of a person in a video with someone else's. Used well, it could have benefited society; instead it has been used to create pornography and has become a serious social problem.


Deepfake technology began to be widely known last August, when a Chinese company released an application called 'Zao' that swaps the faces of movie actors with those of ordinary users. Zao was enormously popular on release, but at the same time it sparked a fierce backlash over privacy invasion.


With advances in science and technology, machines have become capable of learning. 'Machine learning,' in which a system teaches itself from vast amounts of data in order to make accurate decisions, was developed a step further into what is called deep learning. Deepfake is the technology built on this deep learning.


The word is a compound of 'deep learning' and 'fake.' It is a technique that synthesizes video and audio using a deep learning architecture called a generative adversarial network (GAN). Fed a target's facial expressions, mannerisms, and voice, the AI (artificial intelligence) learns on its own and can generate a person indistinguishable from the real one. With computer graphics (CG) applied on top, even lip movements, gestures, and facial features are rendered so naturally that it becomes hard to tell real from fake.
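
The adversarial setup can be illustrated with a short sketch. Below is a minimal, illustrative generative adversarial network written in PyTorch that learns a toy two-dimensional distribution instead of faces; the network sizes, learning rates, and stand-in data are assumptions made purely for illustration and are not taken from any actual deepfake tool.

import torch
import torch.nn as nn

NOISE_DIM, DATA_DIM = 8, 2

# Generator: turns random noise into fake samples.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 32), nn.ReLU(),
    nn.Linear(32, DATA_DIM),
)

# Discriminator: scores how "real" a sample looks (1 = real, 0 = fake).
discriminator = nn.Sequential(
    nn.Linear(DATA_DIM, 32), nn.ReLU(),
    nn.Linear(32, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

def real_batch(n=64):
    # Stand-in for real training data (face images, in an actual deepfake system).
    return torch.randn(n, DATA_DIM) * 0.5 + torch.tensor([2.0, -1.0])

for step in range(2000):
    # 1) Train the discriminator to separate real samples from generated ones.
    real = real_batch()
    fake = generator(torch.randn(64, NOISE_DIM)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator.
    fake = generator(torch.randn(64, NOISE_DIM))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

print("sample fakes:", generator(torch.randn(3, NOISE_DIM)).detach())

The essential point is the tug-of-war in the training loop: the discriminator learns to tell real samples from generated ones, while the generator learns to produce samples the discriminator accepts as real. Scaled up to facial images and far larger networks, that same dynamic is what allows a deepfake model to produce faces that pass for genuine footage.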


When deepfake technology first appeared, it raised expectations in many fields, from news anchoring to dramas, movies, education, and marketing. But because its first widespread use was pornography, it is being abused as a tool for 'crime' rather than enjoyed as a 'face-swapping game,' and that is the problem.


The misuse of deepfake technology is a serious problem worldwide. Famous actresses Scarlett Johansson and Emma Watson have been victims of synthesized pornographic videos, and roughly a quarter of such videos target Korean female celebrities. In the UK, the synthesized voice of an energy company's CEO was used to trick an employee into transferring about 300 million won.


It is a mistake to think this technology is used only for special effects in movies. We now live in an era in which anyone can produce convincing fake videos with free source code and machine learning algorithms available online. As the social impact of deepfake abuse has grown, the international community has begun to draw up countermeasures.


The U.S. government is funding media forensics research to detect deepfakes through the Defense Advanced Research Projects Agency (DARPA), and Facebook has invested 10 million dollars in research on deepfake video detection. Last September, Google also released a database of 3,000 deepfake videos it had produced.
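
A common baseline in this line of research is to score individual video frames with an image classifier and flag a clip when enough frames look synthetic. The sketch below shows that frame-level idea in PyTorch; the untrained ResNet backbone, the dummy frames, and the averaging rule are simplifying assumptions for illustration, not the detectors actually built by DARPA, Facebook, or Google.

import torch
import torch.nn as nn
from torchvision import models

# Reuse a standard image backbone and replace its head with a two-class output
# (index 0 = real, index 1 = fake). In practice this would be fine-tuned on a
# labeled dataset such as the publicly released deepfake video collections.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()

def score_frames(frames: torch.Tensor) -> torch.Tensor:
    """Return the probability that each frame (N, 3, 224, 224) is fake."""
    with torch.no_grad():
        return torch.softmax(model(frames), dim=1)[:, 1]

# A clip would be flagged when enough of its frames look synthetic.
dummy_frames = torch.rand(8, 3, 224, 224)  # stand-in for decoded video frames
clip_score = score_frames(dummy_frames).mean()
print("estimated probability the clip is a deepfake:", float(clip_score))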


In addition, Reddit shut down its deepfake-related subreddits, Twitter decided to classify deepfake material separately from ordinary adult content and remove it, and graphics software company Adobe, working with The New York Times and Twitter, introduced a digital watermark that can identify the original creator and source of photos, videos, and news content.
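
The watermarking effort is, at its core, a content-provenance scheme: the publisher attaches signed metadata (creator, source, and a hash of the content) that anyone can verify later. The toy sketch below illustrates that general idea using only Python's standard hashlib and hmac modules; the key, the field names, and the use of an HMAC in place of a public-key signature are simplifying assumptions, not the actual scheme adopted by Adobe, The New York Times, or Twitter.

import hashlib
import hmac
import json

SECRET_KEY = b"publisher-signing-key"  # hypothetical key held by the publisher

def attach_provenance(content: bytes, creator: str, source: str) -> dict:
    """Bundle the content hash and origin info, signed by the publisher."""
    record = {
        "creator": creator,
        "source": source,
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_provenance(content: bytes, record: dict) -> bool:
    """Check that the content and its origin metadata were not tampered with."""
    claimed = dict(record)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(signature, expected)
            and claimed["sha256"] == hashlib.sha256(content).hexdigest())

video = b"...raw video bytes..."
record = attach_provenance(video, creator="Newsroom A", source="example-news.com")
print(verify_provenance(video, record))                # True: intact
print(verify_provenance(video + b"tampered", record))  # False: content changed

In a real deployment the record would be signed with the publisher's private key and verified with the matching public key, rather than with a shared secret as in this simplified example.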


What about Korea? Civil society groups have called for laws that directly address crimes committed with deepfakes, and several amendments to the Sexual Violence Punishment Act that would directly punish deepfake pornography have been introduced in the National Assembly.


However, the amendment to the Sexual Violence Punishment Act passed by the National Assembly last month punishes only those who produce and distribute deepfake pornography, leaving out those who possess it, which has stirred controversy. And the remarks some lawmakers made while the amendment was being debated were seriously troubling.

Our National Assembly remains the same. We hope the next National Assembly will be better than the current one.
[Photo: YouTube screen capture]

Jeong Jeom-sik, a lawmaker of the United Future Party, asked, "Should we punish even those who enjoy such videos alone, for their own satisfaction?" Kim In-gyeom, deputy head of the National Court Administration, said, "One could think of it as creating a work of art," and Kim Oh-soo, Deputy Minister of Justice, remarked, "Young people, or those still growing up, often do such things on their own computers."


Their argument was that punishing people for producing videos meant only for personal possession goes too far. Meanwhile, women's fears that they could become victims of sexually exploitative videos without ever knowing it were brushed aside. Wasn't it precisely this lack of awareness that produced the Nth Room incident? What if someone becomes a victim and never finds out? The mere thought is horrifying.


Not only women but men, too, can become victims. If their own family members or acquaintances had suffered such harm, would their statements have been the same? The victims speak with one voice: "We cannot understand why anyone would possess such videos. Possessing them is a crime all the same. Everyone involved should be punished."


The letter of the law matters, but social filtering is needed as well. Above all, portal sites and other platforms that distribute such content urgently need to develop their own technologies for detecting and blocking it. If deepfake abuse is left unchecked, we will end up in a world where nothing we read or see can be trusted. Shouldn't strong regulation come before that happens?




© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
