Apple has released an FAQ titled “Expanded Protections for Children” aimed at allaying users’ privacy concerns about the new CSAM detection in iCloud Photos and communication security for iMessage that the company announced last week.
Apple's new tools designed to protect children have drawn mixed reactions from security and privacy experts, with some incorrectly claiming that Apple is abandoning its stance on privacy. In response, Apple has now published a rebuttal in the form of an FAQ (Frequently Asked Questions) document.
In the document's introduction, Apple writes: "At Apple, our goal is to build technology that empowers people and enriches their lives—while helping them stay safe. We want to protect children from predators who use communication tools to recruit and exploit them and limit the spread of child sexual abuse material (CSAM). Since we announced these features, many people, including privacy and child protection organizations, have expressed support for this new solution, and some have reached out with questions."
What are the differences between communication security in iMessage and CSAM detection in iCloud Photos?
The two features work completely independently of each other and do not rely on the same technology. Apple emphasizes that the new features in iMessage "are designed to give parents additional tools to protect their children." Images sent or received via iMessage are analyzed on the device "and therefore [the feature] does not change the privacy guarantees of iMessage." Regarding CSAM detection, the company explains:
CSAM detection in iCloud Photos does not send information to Apple about photos other than those that match known CSAM images.
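To picture how such a check can work in principle, the following Swift sketch compares a fingerprint of a photo against a set of fingerprints of known images and reports nothing unless there is a match. This is a deliberately simplified illustration, not Apple's actual mechanism: Apple's system uses NeuralHash together with private set intersection and threshold secret sharing rather than a plain hash lookup, and every name and value below (KnownHashDatabase, the placeholder digests) is hypothetical.

```swift
// Simplified, illustrative sketch of "report only on a match against known images".
// Not Apple's implementation; an ordinary SHA-256 digest stands in for the real
// perceptual-hash and cryptographic matching protocol.
import CryptoKit
import Foundation

struct KnownHashDatabase {
    // Hex-encoded digests of known images (hypothetical placeholder data).
    let knownDigests: Set<String>

    // Returns true only when the photo's digest appears in the known set;
    // non-matching photos produce no information at all.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownDigests.contains(hex)
    }
}

let database = KnownHashDatabase(knownDigests: ["placeholder-digest-1", "placeholder-digest-2"])
let photo = Data() // stand-in for image bytes already present on the device
if database.matches(photoData: photo) {
    print("Match against a known image: only this event would be surfaced")
} else {
    print("No match: no information about this photo leaves the device")
}
```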
iCloud Photos: Apple will reject government demands
Much of the document echoes the detailed summary we published just yesterday, which you can find linked below. One particularly controversial point, however, is now addressed explicitly, whereas previously it had only been mentioned in passing. Apple responds to the concern raised by privacy and security experts that this on-device image scanning could easily be expanded on behalf of authoritarian governments demanding that Apple broaden the search.
Apple will reject such requests. We have faced requests to develop and deploy government-mandated changes that compromise user privacy before, and we have steadfastly rejected those requests. We will continue to reject them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud Photos, and we will not comply with any government request to expand it.
If you want to read Apple's FAQ, you can find the document here. Below you will also find our recently published article, which both examines the underlying technology and answers the most important questions. (Image: Apple)