According to recent reports, Apple has been scanning iCloud emails for child abuse imagery since 2019. Many privacy experts are worried about the company’s decision to begin performing such scans on users’ local devices.

CNET reports that tech giant Apple has been scanning emails stored in the company’s iCloud service for child abuse imagery since 2019. The news comes as Apple faces increased scrutiny over its recent decision to scan user devices for Child Sexual Abuse Material (CSAM), a move that has worried many privacy experts, who warn that Apple could be pressured into scanning user devices for content other than images of child sexual abuse.

Apple claims that the way it detects CSAM is “designed with user privacy in mind.” Rather than directly accessing iCloud users’ photos, the company uses a device-local, hash-based lookup and matching system to cross-reference the hashes of a user’s photos with the hashes of known CSAM. If a user’s photos match entries in the CSAM database, Apple manually reviews the matches, disables the user’s account, and sends a report to the National Center for Missing & Exploited Children (NCMEC).
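For readers curious how hash-based matching works in broad strokes, the toy sketch below simply checks whether an image’s hash appears in a set of known hashes. It is only an illustration: the function names, example hash, and use of SHA-256 are placeholders, while Apple has described its actual system as relying on a perceptual “NeuralHash” algorithm plus additional cryptographic safeguards.

```python
import hashlib

# Toy stand-in for a database of hashes of known images (hex digests).
# In Apple's described system the hashes come from NCMEC and use a perceptual
# "NeuralHash", not the cryptographic SHA-256 digests used in this sketch.
KNOWN_IMAGE_HASHES = {
    # SHA-256 of an empty byte string, used here purely as example data.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest standing in for an image hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_IMAGE_HASHES

print(matches_known_database(b""))       # True  (hash is in the example set)
print(matches_known_database(b"photo"))  # False (hash is not in the set)
```

The point of the comparison is that only hashes, not the photos themselves, are checked against the database.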


However, many privacy experts are extremely worried about the new system. NSA whistleblower Edward Snowden tweeted about the issue, stating: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Apple recently revealed that it has been scanning iCloud Mail for CSAM for the past two years, a detail it had not previously disclosed to customers. Earlier versions of Apple’s website said the company “uses image matching technology to help find and report child exploitation” by looking at “electronic signatures.”

Craig Federighi, Apple’s head of software engineering, said in an interview with The Wall Street Journal last month: “If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it; we wanted to be able to spot such photos in the cloud without looking at people’s photos.”

Many privacy experts, including Edward Snowden, the Freedom of the Press Foundation, and the Electronic Frontier Foundation (EFF), have taken issue with Apple’s plans to extend this scanning from the cloud to users’ local devices. Commenting on the on-device scanning, the EFF said: “Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor. We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”

Read more at CNET here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com.