Type the Adult Search 'Code' 'Untitled,' and Nth Room-Like Images Flood In
Broadcasting Commission: "Corrective Measures and Review Requested"
Serious Regulatory Blind Spot for Overseas Operators' 'Obscene Material'
[Asia Economy Reporter Koo Chae-eun] #Office worker Kim Su-hyun (alias, 44) was shocked after clicking a URL posted in a group chat. When he entered 'Jemok-eopseum' (Untitled) and 'Jemok-eopseum XXX' into the Google search bar, dozens of explicit exposure photos and celebrity nude composite images appeared without any adult verification. Many of the photos appeared to be sexual exploitation materials involving teenagers, reminiscent of the Nth Room case. "I didn't know the portal displayed pornography like this," Kim said. "I'm scared my daughter might see it."
It has been revealed that entering certain 'adult search terms' openly exposes obscene materials and sexual exploitation content similar to that of the Nth Room case. Although the Korea Communications Commission (KCC) and the Korea Communications Standards Commission (KCSC) have demanded corrective action on such search results, Google's deletion process has been slow and half-hearted, allowing obscene and sexual exploitation materials to spread.
According to industry sources and the KCC on the 27th, so-called 'adult search terms' expose sexual exploitation and obscene images without any adult verification, prompting the KCC to request corrective measures from Google. 'Jemok-eopseum' (Untitled), which has become symbolic of these search terms, is a representative example. The term originated on various internet boards where obscene materials were uploaded without titles or descriptions. Searching 'Jemok-eopseum XXX' on portals such as Google indiscriminately exposes all manner of obscene materials without adult verification, and a significant number of people in their teens and twenties know this 'code.'
A KCC official stated, "If a search term is found to expose sexual exploitation materials unrelated to the original content, we can request corrective action from overseas operators," adding, "We will review whether the images are related to Nth Room sexual exploitation materials, demand self-regulation from Google, and, together with the Standards Commission, examine whether deletion measures are necessary."
The problem is that it is uncertain whether overseas operators like Google will comply. According to the office of Park Kwang-on, a Democratic Party of Korea lawmaker, only 32% of the digital sex crime materials that the KCSC requested be deleted were actually removed by overseas internet platform operators such as Google and Twitter. When images were voluntarily posted by users and are not sexual exploitation materials from digital sex crimes, deletion often stalls over concerns about 'freedom of expression' and 'censorship controversies.'
As a result, obscene materials concentrate on overseas platforms, creating a regulatory blind spot. Domestic platforms such as Naver and Daum undergo KCSC review, which prevents highly obscene or quasi-sexual-exploitation images from being posted even after adult verification. Overseas sites like Google, however, are subject only to 'self-regulation' requests. Kang Sang-hyun, chairman of the KCSC, stated at the emergency briefing on the Nth Room incident held on the 25th, "There is a need for legislation that obligates overseas operators to accept corrective requests concerning digital sex crime materials and to delete or block them."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.