Apple has quietly removed all references to CSAM detection from its child protection website, suggesting that its controversial plan to detect child sexual abuse images on iPhones and iPads is in jeopardy following heavy criticism of its methods.
In August, Apple announced a number of new child safety features, including scanning users' iCloud photo libraries for Child Sexual Abuse Material (CSAM); Communication Safety, which warns children and their parents when they receive or send sexually explicit photos; and expanded CSAM guidance in Siri and Search. Shortly after the announcement, the features were criticized by a variety of people and organizations, including security researchers, whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), universities, and many more. Most of the criticism was directed at Apple's planned on-device CSAM detection, which researchers described as a dangerous technology bordering on surveillance and argued could prove ineffective at detecting child sexual abuse images.
CSAM detection in iCloud Photos: Has Apple abandoned the plan?
Apple initially tried to clear up misunderstandings and reassure users by releasing detailed information, FAQs, various new documents, interviews with company executives, and more to allay concerns. But despite these efforts, the controversy did not subside. Apple eventually went ahead with the rollout of the Communication Safety features for Messages, which went live earlier this week with the release of iOS 15.2. However, after a flood of criticism that the company clearly did not expect, Apple decided to delay the rollout of CSAM detection in iCloud Photos, citing feedback from customers, advocacy groups, researchers, and others. The company stated:
We've decided to take more time in the coming months to gather feedback and make improvements before releasing these important child safety features.
The above statement was added to Apple's official child safety website, but it has now disappeared, along with all mentions of CSAM detection. Many are now wondering whether Apple has abandoned the plan entirely. An official statement is still pending.