Apple announced this week that, starting with iOS 15 and its other upcoming OS releases, the company will scan photos stored in iCloud Photos for known images of child sexual abuse material (CSAM). When such material is found, the case will be reported to the National Center for Missing and Exploited Children, a nonprofit organization that works with law enforcement agencies in the United States.
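The detection works by comparing image hashes against a list of hashes of already-known CSAM rather than by trying to judge new image content. The following minimal sketch only illustrates that set-membership idea; every type and function name is a placeholder of this article's own, and Apple's actual on-device protocol (which uses a perceptual hash and blinded matching) looks nothing like this.

```swift
import Foundation

// Purely illustrative: Apple's real system uses a perceptual hash and a blinded
// on-device matching protocol; a plain hash string stands in for that here.

/// Placeholder for a perceptual hash of an image.
typealias ImageHash = String

struct CSAMMatcher {
    /// Hashes of known CSAM supplied by child-safety organizations.
    let knownHashes: Set<ImageHash>

    /// A photo counts as a match only if its hash is already in the known database;
    /// the system does not attempt to classify novel image content.
    func isKnownMatch(_ photoHash: ImageHash) -> Bool {
        knownHashes.contains(photoHash)
    }
}
```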
Apple's plans have raised concerns among some security researchers, other observers, and users that Apple could eventually be forced by governments to add non-CSAM images to the hash list for nefarious purposes, such as suppressing political activism. The prominent whistleblower Edward Snowden said:
“No matter how well-intentioned, Apple is introducing mass surveillance across the world. If they can scan for child porn today, they can scan for anything tomorrow.”

CSAM detection system: initially limited to the USA
The nonprofit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully designed, and narrowly defined backdoor is still a backdoor." To allay these concerns, Apple has spoken out again today: the new CSAM detection system will be limited to the United States when it launches. To address the risk that some governments might try to abuse the system, Apple confirmed to 9to5mac and MacRumors that it will consider a possible global expansion only after a legal assessment on a country-by-country basis. Apple did not provide a timeframe for a broader rollout, should such a move ever occur.
“There is no one-size-fits-all solution to the potential for abuse”
In addition, the company addressed the hypothetical possibility that a government in a certain region of the world could try to corrupt a security organization, pointing out that the system's first layer of protection is an undisclosed threshold of matches that must be exceeded before a user is flagged at all. Even if the threshold is exceeded, Apple's manual review process would serve as an additional barrier. Apple also highlighted advocates of the system who have praised the company for its efforts in the fight against child abuse. Finally, Apple acknowledged that there is no silver-bullet solution to the system's potential for abuse, but the Cupertino-based company stressed that it is committed to using the system exclusively for detecting known CSAM images.
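To make the two layers described above concrete, here is a small, purely hypothetical sketch: the threshold value and every name are placeholders, since Apple has not disclosed the actual figure or any of these interfaces.

```swift
import Foundation

// Hypothetical sketch of the two protective layers described above. Apple has not
// disclosed the real threshold or any of these interfaces; all values and names
// here are placeholders.

struct AccountReviewPolicy {
    /// Number of matched images an account must exceed before it is flagged at all.
    let flaggingThreshold: Int

    /// First layer: nothing is surfaced until the match count exceeds the threshold.
    func exceedsThreshold(matchCount: Int) -> Bool {
        matchCount > flaggingThreshold
    }

    /// Second layer: even a flagged account is only reported if the manual
    /// review step confirms the matches.
    func shouldReport(matchCount: Int, humanReviewConfirms: Bool) -> Bool {
        exceedsThreshold(matchCount: matchCount) && humanReviewConfirms
    }
}
```

The point of the second method is that exceeding the threshold alone never triggers a report; the manual review acts as a separate gate.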