About a year ago, Apple announced three new child protection features: a system to detect known child sexual abuse material (CSAM) in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and expanded child abuse resources for Siri. The latter two features are now available, but Apple has remained silent on the CSAM detection feature for iCloud Photos.
Apple originally stated that CSAM detection would arrive in an update to iOS 15 and iPadOS 15 by the end of 2021. However, the company eventually postponed the feature, citing "feedback from customers, stakeholders, researchers, and others." In September 2021, Apple posted the following update to its child safety page:
We previously announced plans for features to help protect children from predators who use communications to recruit and exploit them and to limit the spread of child sexual abuse material. Based on feedback from customers, stakeholders, researchers and others, we have decided to take more time in the coming months to gather input and make improvements before releasing these critically important child safety features.
CSAM in iCloud Photos: Apple's plans remain unclear
In December 2021, Apple removed the above update and all references to its CSAM detection plans from its child safety page. An Apple spokesperson told The Verge at the time that Apple's plans for the feature had not changed, but as far as we know, the company has not commented publicly on them since. Apple did ship the child safety features for the Messages app and Siri with iOS 15.2 and other software updates in December 2021, and expanded them to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software updates in May 2022.

Apple said the CSAM detection system was "designed with user privacy in mind." Rather than scanning photos on its servers, the system performs an "on-device comparison against a database of known CSAM image hashes" supplied by child protection organizations, which Apple converts into an "unreadable set of hashes that is securely stored on users' devices."
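To make that description concrete, here is a minimal sketch of an on-device set-membership check of the kind Apple described. It is purely illustrative: Apple's published design involved a perceptual hash (NeuralHash) and a blinded database compared through a private set intersection protocol, whereas the sketch below uses a plain SHA-256 hex digest and an ordinary Swift Set, and the knownHashes contents are hypothetical placeholders.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for an image hash. The real design described a
// perceptual hash and cryptographic blinding; a plain SHA-256 hex digest is
// used here only to show the shape of an on-device set-membership check.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical on-device database of known image hashes, standing in for the
// "unreadable set of hashes that is securely stored on users' devices".
let knownHashes: Set<String> = [
    "3a1f5b9c"  // placeholder entry, not a real hash
]

// The comparison runs locally, before upload; the photo itself never leaves
// the device for this check.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}

// Example: check a (hypothetical) photo's bytes.
let photoData = Data("example image bytes".utf8)
print(matchesKnownDatabase(photoData))  // false, since the placeholder set has no real entries
```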
Apple's plans sparked much criticism
Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that works with law enforcement. Apple said there would be a "threshold" of matches to ensure the chance of an account being incorrectly flagged by the system was "less than one in a trillion," as well as human review of flagged accounts.

Even so, Apple's plans drew criticism from a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, political groups, university researchers, and even some Apple employees. Some critics argued that the child safety features would create a "backdoor" into Apple's devices that governments or law enforcement could use to monitor users. Another concern was the potential for false positives, for example if someone intentionally added known CSAM images to another person's iCloud account in order to get that account flagged.
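The threshold idea can be sketched just as simply. The counter, the threshold value, and the escalation step below are illustrative assumptions rather than Apple's actual parameters or workflow; the point is only that a single coincidental match would not be enough to surface an account, and that crossing the threshold leads to human review rather than an automatic report.

```swift
// Minimal sketch of threshold-gated flagging: nothing is escalated until the
// number of matched images for an account crosses a preset threshold, and even
// then the outcome is a human review, not an automatic report.
struct AccountScanState {
    var matchedImageCount = 0
    let matchThreshold = 30  // hypothetical value, for illustration only

    mutating func recordMatch() {
        matchedImageCount += 1
    }

    var exceedsThreshold: Bool {
        matchedImageCount >= matchThreshold
    }
}

var state = AccountScanState()

// recordMatch() would be called once per image whose hash is in the known set.
state.recordMatch()

if state.exceedsThreshold {
    // Only at this point would the account be queued for manual human review.
    print("Queue account for human review")
}
```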