Children's Faces and Voices Replicated by Deepfakes
Public SNS Posts Become Targets of Crime
The National Police Agency's National Investigation Headquarters announced on the 7th, "A telephone financial fraud case has occurred in which a fake video of a child's face, synthesized with deepfake technology, was sent to the parents with a demand for money," and urged special caution.
In October, a foreign criminal organization sent parents a video showing their daughter, who was traveling in Korea, apparently being held captive and pleading for help. The perpetrators threatened, "We have kidnapped your daughter. If you want to save her, send the ransom," and the parents reported the incident to their consulate and to the Korean police. Although no financial loss occurred in this case, it demonstrated that deepfake technology can be misused for criminal purposes.
Because deepfakes and deep voices must be trained on real people, videos, photos, and voice recordings posted publicly on social networking services (SNS) can become source material for them. Deepfake technology has advanced to the point where even experts find it difficult to judge authenticity with the naked eye, so it is advisable to avoid posts whose privacy settings make them visible to anyone.
The head of the Drug and Organized Crime Investigation Division at the National Police Agency said, "We will produce promotional content to prevent phishing crimes that abuse artificial intelligence (AI) technology," adding, "We will work to protect our citizens by distributing it not only domestically but also through overseas police attachés and Korean community associations."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.