Not only has Apple introduced end-to-end encryption for iCloud Photos, but it has also abandoned its controversial plans to detect known child sexual abuse material (CSAM) stored there.
Alongside its announcement of extensive new security features, Apple has now confirmed (via Wired) that it is no longer pursuing plans to detect CSAM in iCloud Photos. Below is Apple's full statement:
After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
CSAM detection in iCloud Photos: Apple speaks out after a year of silence
In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the US with iOS 15.2 in December 2021 and has since expanded to the UK, Canada, Australia, and New Zealand; the Siri resources are also available.

Apple originally said CSAM detection would arrive in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company eventually delayed the feature due to "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has officially abandoned its CSAM detection plans altogether.
- iCloud Photos & more: Apple expands end-to-end encryption options
- iMessage & Apple ID: Apple shows new security features