Epic Games CEO criticizes Apple over new CSAM safety initiatives
- Epic CEO Tim Sweeney takes to Twitter once again to criticize Apple.
- Sweeney believes the tools are inescapably government spyware.
- However, Sweeney's accusations do not accurately reflect how Apple's system actually works.
Epic Games CEO Tim Sweeney has attacked Apple over its new child safety initiatives, suggesting they could serve as a way for governments to conduct surveillance.
This Thursday, Apple announced a set of new tools aimed at curbing child sexual abuse material (CSAM) and protecting children from people who use technology to exploit them. The initiative introduces new features for iMessage, Siri, and Search, along with a new system for scanning iCloud Photos for known CSAM imagery.
Tim Sweeney has joined the outpouring of criticism against Apple, recently taking to Twitter once again, this time to complain about the new initiative. Sweeney argues that Apple is potentially paving the way for government surveillance of user data:
I've tried hard to see this from Apple's point of view, but inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government.
This is entirely different from a content moderation system on a public forum or social medium. Before the operator chooses to host the data publicly, they can scan it for whatever they don't want to host. But this is people's private data.
However, Sweeney's accusations about personal data scanning do not accurately reflect how Apple's system actually works.
Scanning is performed on-device: a mathematical hash is generated for each image before it is uploaded to iCloud Photos, which means Apple works with hashes rather than the photos themselves unless an account is flagged. These hashes are then checked against a database of known CSAM image hashes. If enough of the generated hashes match known CSAM, the account is flagged and the National Center for Missing & Exploited Children (NCMEC) is alerted.
Furthermore, this scanning applies only to iCloud Photos; images stored on a device with iCloud Photos turned off are never scanned.
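For readers who want a concrete picture of what hash matching looks like, here is a minimal, purely illustrative Swift sketch. It uses a SHA-256 digest as a stand-in for Apple's perceptual NeuralHash and a plain in-memory set standing in for the encrypted on-device database of known hashes; the real system also relies on a private matching protocol and a threshold of matches before any account is ever reviewed.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the on-device database of known CSAM hash digests.
// In Apple's actual design this database is encrypted and not readable like this.
let knownHashDigests: Set<String> = []

// Returns true if the image's digest matches a known hash.
// SHA-256 is used here only as a stand-in for Apple's perceptual NeuralHash.
func matchesKnownHash(imageData: Data) -> Bool {
    // The hash is computed on-device; only the digest is compared,
    // and the raw image is never inspected by this check.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashDigests.contains(hex)
}
```

The point of the sketch is simply that the comparison happens between hashes, not between photos, which is why an unflagged image is never examined directly.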
Sweeney then goes on to claim that Apple uses "dark patterns" to turn on iCloud uploads by default, thereby forcing users to "accumulate unwanted data," which is just not the case.
Finally, he concludes his tweet thread by writing:
Liberty is built on due process and limited government. The existential threat here is an unholy alliance between the government and the monopolies who control online discourse and everyone's devices, using the guise of private corporations to circumvent constitutional protections.
Apple has addressed concerns about CSAM detection and announced earlier this week that the new safety tools will roll out later this year with iOS 15 and iPadOS 15.