Whistleblower: Apple Listens in on Users’ Sexual Encounters, Private Discussions

Tim Cook, CEO of Apple, laughing (Stephanie Keith/Getty)

Apple contractors around the world regularly hear people having sex, discussing confidential medical and legal matters, and committing crimes such as drug deals while performing quality control work for the Silicon Valley giant’s Siri virtual voice assistant.

In a report published Friday, the Guardian claimed to have spoken with a whistleblower, who asked to remain anonymous, regarding “the frequency with which accidental activations pick up extremely sensitive personal information.”

The whistleblower reportedly told the Guardian:

There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.

To evaluate the accuracy of Siri’s transcriptions of user voice commands and improve the service’s natural-language processing, Apple devices using Siri regularly transmit portions of their recordings to the company’s contractors.

Siri is ubiquitous across current Apple platforms and devices, including iOS on iPhones, iPadOS on iPads, watchOS on the Apple Watch, macOS on Mac Pros, MacBooks, and iMacs, and tvOS on the Apple TV.

Apple told the Guardian:

A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.

The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

Apple has “no specific procedures to deal with sensitive recordings,” the whistleblower claimed, according to the Guardian, adding:

The regularity of accidental triggers on the [Apple] watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.

[Sometimes] you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.

Virtual voice assistant technology is expanding across the technology industry, from Amazon’s Alexa to the Google Assistant.

Apple CEO Tim Cook advocated for censorship of “those who push hate [and] division” across his company’s platforms at a December 2018 conference hosted by the Anti-Defamation League (ADL).

Follow Robert Kraychik on Twitter.
