"Serious Invasion of Privacy"...Global Human Rights Organizations Demand Apple Abandon Surveillance System

Apple Develops iCloud Monitoring System to Detect Child Sexual Exploitation Material
Nonprofit Organization Warns "Backdoors Could Be Created... Government May Abuse for Content Censorship"

"Serious Invasion of Privacy"...Global Human Rights Organizations Demand Apple Abandon Surveillance System [Image source=Reuters Yonhap News]


[Asia Economy Reporter Kim Suhwan] More than 90 nonprofit organizations worldwide have demanded that Apple withdraw the automatic detection system it plans to trial on iCloud to root out child sexual abuse material (CSAM), arguing that the system could be exploited for surveillance and other abuses.


According to major foreign media on the 19th, over 90 policy and human rights organizations issued a statement demanding that Apple immediately abandon the introduction of the CSAM monitoring system.


In the statement, these organizations said, "The system Apple intends to introduce suppresses freedom of expression, invades privacy, and threatens the safety of countless people around the world."


The statement also warned, "Apple's monitoring system could be abused to create backdoors," and, "Governments around the world might use it to censor content for purposes other than detecting sexual exploitation material."


A representative from the U.S.-based nonprofit 'Center for Democracy and Technology (CDT),' which participated in drafting the statement, said, "We are very disappointed by Apple's plan to introduce the monitoring system," and added, "Apple has been a company that has led the way in protecting privacy."


Earlier, on the 5th, Apple announced that it had developed an automatic detection system that identifies photos of child sexual exploitation material uploaded to iCloud and notifies the nonprofit National Center for Missing and Exploited Children (NCMEC).


This detection system uses a neural matching function called 'NeuralHash' to check whether images stored on an iPhone match the unique digital fingerprints (hashes) of known child exploitation images held by NCMEC, and notifies NCMEC when a match is found.
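
For illustration only, the flow the article describes, comparing an image's digital fingerprint against a database of known hashes and flagging matches, can be sketched roughly as follows. This is not Apple's actual NeuralHash, which is a perceptual, neural hash designed to tolerate small image edits; the SHA-256 stand-in and the placeholder names below are assumptions used solely to show the match-and-flag structure.

```python
import hashlib
from typing import Set

# Hypothetical set of digital fingerprints (hashes) of known images,
# standing in for the NCMEC-provided database described in the article.
KNOWN_HASHES: Set[str] = {
    "placeholder_fingerprint_1",  # illustrative entries, not real hashes
    "placeholder_fingerprint_2",
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image.

    A cryptographic SHA-256 hash is used here only as a stand-in;
    NeuralHash is a perceptual hash, so this sketch illustrates the
    matching flow, not Apple's actual algorithm.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_material(image_bytes: bytes, known: Set[str]) -> bool:
    """Return True if the image's fingerprint appears in the known set."""
    return fingerprint(image_bytes) in known


if __name__ == "__main__":
    # Only images whose fingerprints match the known set would be flagged.
    sample = b"example image bytes"
    print(matches_known_material(sample, KNOWN_HASHES))
```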


Apple said it would introduce the system on a trial basis and distribute it within the year through an update to the iPhone operating system (iOS 15).


Following Apple's announcement, human rights organizations worldwide raised concerns about privacy violations, prompting the company to respond by stating, "If governments request the use of this system for surveillance, we will refuse."


Apple also explained, "The system is strictly limited to detecting child sexual exploitation material stored on iCloud, and no government can force the addition of other content beyond CSAM to the detection targets."


It is reported that human rights organizations from Brazil and India also took part in the statement. Previously, Brazilian investigative authorities asked Facebook to disclose the messages of specific individuals on its mobile messenger app WhatsApp for criminal investigations, but Facebook refused. India, for its part, recently passed legislation allowing conversations on messenger applications to be traced.


Human rights organizations from Mexico, Germany, Argentina, Ghana, and Tanzania also joined in issuing the statement.


A representative from a Brazilian human rights organization involved in the statement said, "Our greatest concern is that Apple's monitoring system could be used by other companies for different purposes," adding, "This represents a serious breach of security."


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.
