Freedom of the Press Foundation: Apple’s Device Surveillance Threatens Privacy and Whistleblowers

Tim Cook speaking at an Apple event (Justin Sullivan/Getty)

In a recent article, the Freedom of the Press Foundation joined many other privacy advocates in raising concerns over Apple’s recently announced plan to scan photos on user devices for child sexual abuse material (CSAM). The organization warns that false positives, particularly ones deliberately engineered by tampering with or expanding the matching database, put press freedom at risk.

The article, titled “Apple’s device surveillance plan is a threat to user privacy — and press freedom,” argues that Apple’s plan poses a major threat to user privacy.

Apple claims its CSAM detection is “designed with user privacy in mind”: rather than directly accessing iCloud users’ photos, the system performs a device-local, hash-based lookup that cross-references the hashes of user photos against a database of hashes of known CSAM.
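To make the mechanism concrete, here is a rough sketch of what a device-local hash lookup of this kind could look like, written in Python. It is a deliberate simplification, and every name in it is a hypothetical placeholder: Apple’s described system uses a perceptual hash (“NeuralHash”) with cryptographic blinding rather than plain SHA-256 comparisons of file bytes.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known flagged images. In Apple's
# described design this would be a blinded set of perceptual ("NeuralHash")
# values shipped with the operating system, not raw SHA-256 digests.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Exact hash of the file bytes. A real deployment would use a
    perceptual hash so that resized or re-encoded copies still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return the on-device photos whose hashes appear in the database."""
    return [p for p in photo_dir.glob("*.jpg") if image_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for hit in scan_library(Path.home() / "Pictures"):
        print(f"match: {hit}")
```

Apple has also said that an account is only reported after a threshold number of matches is reached, but the core privacy question is the same either way: the device is only as trustworthy as the database it checks against.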

However, many privacy experts are apprehensive about the new system. NSA whistleblower Edward Snowden tweeted about the issue, stating: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

The Freedom of the Press Foundation addressed the issue, writing:

Very broadly speaking, the privacy invasions come from situations where “false positives” are generated — that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present. These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple’s algorithm into erroneously matching an existing image. (Apple, for its part, has said that an accidental false positive — where an innocent image is flagged as child abuse material for no reason — is extremely unlikely, which is probably true.)

The false positive problem most directly touches on press freedom issues when considering that first category, with adversaries that can change the contents of the database that Apple devices are checking files against. An organization that could add leaked copies of its internal records, for example, could find devices that held that data — including, potentially, whistleblowers and journalists who worked on a given story. This could also reveal the extent of a leak if it is not yet known. Governments that could include images critical of its policies or officials could find dissidents that are exchanging those files.

These concerns aren’t purely hypothetical. China reportedly already forces some of its citizens to install apps directly onto devices that scan for images it deems to be pro-Uyghur.
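To make the tampering scenario above concrete, the snippet below continues the simplified sketch from earlier. The file contents and hashes are invented stand-ins, not anything from Apple’s system; the point is that whoever can write to the hash database decides what the scanner hunts for, without touching the code running on anyone’s device.

```python
import hashlib

# Simplified stand-in for the matching database. The scanning logic only
# ever sees opaque digests, so it cannot distinguish abuse imagery from
# anything else that has been inserted here.
known_hashes = set()  # normally populated with hashes of genuine CSAM

# Hypothetical content an adversary wants to locate on people's devices.
leaked_memo = b"<bytes of a leaked internal document>"
protest_poster = b"<bytes of a banned political image>"

# Planting entries requires only write access to the database; the
# scanning code shipped to every device never changes.
for planted in (leaked_memo, protest_poster):
    known_hashes.add(hashlib.sha256(planted).hexdigest())

# From this point on, any device holding copies of those files -- a
# journalist's or a whistleblower's photo library, for example -- would
# be flagged exactly as if it held known abuse imagery.
print(f"database now contains {len(known_hashes)} attacker-chosen entries")
```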

The Foundation notes that Apple has claimed it would stand up to any attempts by foreign powers to force the inclusion of non-CSAM images in the company’s hash database, stating in an FAQ document: “Apple would refuse such demands and our system has been designed to prevent that from happening.”

However, Breitbart News has reported extensively on Apple’s ongoing efforts to appease China, a market that accounts for almost a fifth of the company’s total revenue.

Read more at the Freedom of the Press Foundation here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com
