Apple addresses concerns with CSAM detection, says any expansion will occur on a per-country basis
6th August, 2021 at 1:34 pm by Sam
- Apple has addressed concerns with CSAM detection.
- Security researchers are concerned that governments could use CSAM detection for nefarious purposes.
- Apple also addressed the possibility of a corrupt child safety organization abusing the system.
Apple announced earlier this week that, starting later this year with iOS 15 and iPadOS 15, the company will be able to detect known child sexual abuse material (CSAM) images stored in iCloud Photos. Apple says this will enable it to report these instances to the National Center for Missing and Exploited Children, a nonprofit organization that works in collaboration with law enforcement agencies across the United States.
Unsurprisingly, the plan has sparked major concerns among security researchers and other parties about the possibility that Apple could eventually be forced by a government to add non-CSAM images to the hash list. Researchers warn that governments could exploit such a capability for nefarious purposes, such as suppressing political activism.
The nonprofit Electronic Frontier Foundation has also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought out, and narrowly-scoped back door is still a back door."
Apple has addressed these concerns by providing additional commentary on its plans. Apple says the CSAM detection system will be limited to the United States at launch, and that it is aware of the potential for some governments to try to abuse the system. Apple confirmed to MacRumors that it will consider any potential global expansion of the system on a country-by-country basis, following a thorough legal evaluation.
Apple also addressed the possibility of a corrupt child safety organization trying to abuse the system. Apple says the system's first layer of protection is an undisclosed threshold of matches that must be exceeded before a user is flagged for having inappropriate material. Even if the threshold is exceeded, Apple says its manual review process serves as an additional barrier: no report is made unless the flagged images are confirmed to match known CSAM imagery, so nothing is sent out for images that are not found in the CSAM database.
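To make the two-layer design easier to follow, here is a minimal, hypothetical Swift sketch of a threshold-gated match against a known-hash list followed by a review step. It deliberately omits Apple's actual mechanisms (NeuralHash, on-device matching, and cryptographic safety vouchers); every type, name, and value below is illustrative, not Apple's implementation.

```swift
import Foundation

// Simplified illustration only; all names and values are hypothetical.

struct Photo {
    let id: UUID
    let hash: String   // stand-in for a perceptual image hash
}

struct CSAMScanner {
    let knownHashes: Set<String>   // hashes supplied by child-safety organizations
    let threshold: Int             // undisclosed match threshold in Apple's description

    /// First layer: photos are only flagged once the number of
    /// matches against the known database exceeds the threshold.
    func flaggedPhotos(in library: [Photo]) -> [Photo] {
        let matches = library.filter { knownHashes.contains($0.hash) }
        guard matches.count > threshold else { return [] }
        return matches
    }

    /// Second layer: a human review stands in for Apple's manual check.
    /// A report is produced only if reviewers confirm the matches.
    func report(matches: [Photo], confirmedByReviewer: Bool) -> [Photo] {
        confirmedByReviewer ? matches : []
    }
}

// Example usage with made-up data.
let scanner = CSAMScanner(knownHashes: ["a1b2", "c3d4"], threshold: 1)
let library = [
    Photo(id: UUID(), hash: "a1b2"),
    Photo(id: UUID(), hash: "c3d4"),
    Photo(id: UUID(), hash: "ffff"),
]
let flagged = scanner.flaggedPhotos(in: library)
let reported = scanner.report(matches: flagged, confirmedByReviewer: true)
print("Flagged: \(flagged.count), reported: \(reported.count)")
```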