Apple Stops Listening to Customer Recordings over Security Concerns

ANAHEIM, CA - MARCH 24: CEO of Apple Tim Cook is seen at halftime at the 2016 NCAA Men's Basketball Tournament West Regional at the Honda Center on March 24, 2016 in Anaheim, California. (Photo by Harry How/Getty Images)

Tech giant Apple has reportedly suspended, over security concerns, its global program in which the firm's contractors analyzed recordings of users interacting with the Siri digital voice assistant.

Reuters reports that Silicon Valley giant Apple has suspended a program which had human workers manually analyzing and grading the responsiveness and accuracy of Apple’s Siri digital voice assistant. The suspension of the program is reportedly due to security concerns relating to employees hearing private and confidential details from users.

The decision comes shortly after a report from The Guardian last week which claimed that Apple contractors were regularly listening in on confidential information and private conversations, reportedly even capturing sexual encounters. Apple said in a statement: “While we conduct a thorough review, we are suspending Siri grading globally.” The company added that future software updates will give users the option to opt out of the program.

Apple is not the only Silicon Valley tech firm engaged in this practice. In April 2019, a number of Amazon employees told Bloomberg News that the firm employs thousands of workers to listen to Alexa recordings, transcribe them, and analyze them. The employees claimed that they work on average nine hours a day, with each reviewer listening to as many as 1,000 audio clips per shift. Bloomberg reported:

The work is mostly mundane. One worker in Boston said he mined accumulated voice data for specific utterances such as “Taylor Swift” and annotated them to indicate the searcher meant the musical artist. Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording.

Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.

Breitbart News published an article on how consumers can stop their Amazon Alexa from listening to their recordings; it can be found here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com