Apple employees express concerns over new CSAM scanning
- Apple employees are expressing concerns about CSAM scanning.
- Employees fear that CSAM detection could be exploited by governments.
- Reuters says the information comes from employees who are not part of the security or privacy teams.
Apple has faced a lot of criticism following the announcement of a new system that will scan users' photos for known CSAM (child sexual abuse material). Most of that criticism has come from regular iOS users, but now we're finally hearing what Apple's own employees have to say.
A new report from Reuters says that multiple Apple employees have recently expressed concerns about the new CSAM system on an internal Slack channel. Some of these employees, who asked not to be identified, fear that the feature could be exploited by governments to censor people.
Other employees have been surprised by "the volume and duration of the new debate," with more than 800 messages about CSAM detection posted since the feature was announced just last week. The messages were sent to a Slack channel created to discuss the new features and the possibility of government exploitation, and the threads appear to come from employees who are not part of Apple's lead security and privacy teams, according to Reuters' sources.
The report also notes that a number of employees have argued in favor of CSAM detection, believing the system will "crack down on illegal material" and do only what it was designed to do and nothing else.