‘Whistleblower’ Says Apple Contractors Can Hear Disturbing Interactions with Siri

Hey Siri concept on iOS 13 (Credit: UX Design)

Real humans may be listening to your interactions with Siri. That could come as a shock to some Apple users because of the company’s privacy-focused reputation.

This week, an apparent subcontractor who reviews Siri recordings for Apple has come forward as a “whistleblower.” Here’s what you should know.

Is Apple Listening?

A small percentage of Siri voice recordings are sent to human contractors around the world for review, one of those contractors told The Guardian in a recent report.

Those contractors are then tasked with grading how Siri responds to each request, including determining whether a particular interaction was intentional or triggered by accident.
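Apple hasn’t published what its graders actually record, but for illustration, a grading entry along the lines the source describes might look something like the following Swift sketch. Every type and field name here is an assumption, not Apple’s actual review schema.

```swift
import Foundation

// Hypothetical shape of a grading record; every name here is an
// assumption, not Apple's actual review schema.
enum ActivationKind {
    case intentional
    case accidental   // e.g. a false "Hey Siri" trigger
}

struct SiriGrade {
    let recordingID: UUID
    let activation: ActivationKind
    let responseWasHelpful: Bool
    let notes: String
}

let grade = SiriGrade(
    recordingID: UUID(),
    activation: .accidental,
    responseWasHelpful: false,
    notes: "Triggered by background TV audio, not an intentional request."
)
print(grade.activation)
```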

Worryingly, The Guardian’s source said that Apple’s contractors also occasionally hear uncomfortable or private situations in the recordings. These include interactions between doctors and patients, business meetings, drug deals, and couples having sex.

Some of the interactions also include sensitive data, like names and addresses.

Apple’s contractors are apparently encouraged to report accidental activations as technical errors, but The Guardian’s source said there isn’t a specific procedure in place for dealing with recordings that contain sensitive information.

While the contractor didn’t mention any signs of abuse, they said they came forward because of concerns about overhearing potentially sensitive information and about Apple’s alleged lack of disclosure regarding the manual review process.

Apple’s Privacy Policies

Apple has always been transparent about the fact that some Siri recordings are manually reviewed. Of course, there’s an argument to be made that this information should be more easily accessible.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple told The Guardian in a statement.

The Cupertino tech giant also added that less than 1 percent of all Siri interactions are actually analyzed, and that those interactions are reviewed only in secure facilities where contractors are “under the obligation to adhere to Apple’s strict confidentiality requirements.”
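To put that figure in perspective, here’s a minimal Swift sketch of what sampling “less than 1 percent” of interactions for review could look like. The 0.5 percent rate and the purely random selection are illustrative assumptions; Apple’s real selection criteria aren’t public.

```swift
import Foundation

// Purely illustrative: Apple's actual selection criteria are not public.
// Randomly flags well under 1 percent of interactions for human review.
func selectedForReview(samplingRate: Double = 0.005) -> Bool {
    Double.random(in: 0..<1) < samplingRate
}

let flagged = (0..<10_000).filter { _ in selectedForReview() }
print("\(flagged.count) of 10,000 interactions flagged for review")
```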

It’s important to note that Siri voice recordings cannot be linked back to the user. Apple uses differential privacy techniques to ensure that a Siri recording isn’t linked to an Apple ID or any other personally identifiable details. Instead, Siri requests are linked to rotating device identifiers.

Users can turn off Siri in Settings to effectively “reset” the digital assistant’s memory of a device and get a new device identifier.
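Apple hasn’t detailed the mechanism, but the behavior described in the two paragraphs above can be sketched in a few lines of Swift. The types and method names below are invented for illustration, not Apple’s implementation.

```swift
import Foundation

// Illustrative sketch only; Apple has not published its implementation.
// Models a rotating, pseudonymous device identifier attached to Siri
// requests in place of any account-level identity.
struct SiriRequest {
    let deviceIdentifier: UUID   // pseudonymous, never an Apple ID
    let timestamp: Date
    let transcript: String
}

final class PseudonymousDevice {
    private(set) var identifier = UUID()

    // Tag a request with nothing more than the current rotating identifier.
    func makeRequest(transcript: String) -> SiriRequest {
        SiriRequest(deviceIdentifier: identifier,
                    timestamp: Date(),
                    transcript: transcript)
    }

    // Toggling Siri off and on, per the article, yields a fresh identifier,
    // severing the link between the device and its earlier recordings.
    func resetSiri() {
        identifier = UUID()
    }
}

let device = PseudonymousDevice()
let before = device.makeRequest(transcript: "What's the weather?")
device.resetSiri()
let after = device.makeRequest(transcript: "Set a timer")
print(before.deviceIdentifier == after.deviceIdentifier) // false
```

The key property, at least in this sketch, is that a fresh identifier has no relationship to the old one, so recordings made before a reset can’t be tied to the device going forward.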

Sometimes, a device’s encrypted location is also sent to Apple’s servers to help improve Siri, again tied to a rotating identifier rather than the user. Users can opt out of this by turning off Location Services for Siri in Settings.

Voice data that still carries identifying information is kept for only about six months. After that, all identifying details are stripped, effectively anonymizing the recording, which may then be kept for up to two years to help improve Siri.
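As a hedged illustration rather than Apple’s actual pipeline, that retention policy reduces to a simple rule, sketched here in Swift with approximate durations.

```swift
import Foundation

// Illustrative sketch of the stated policy: recordings keep their device
// identifier for roughly six months, are then anonymized, and may be held
// in anonymized form for up to two years in total. Durations approximate.
struct StoredRecording {
    var deviceIdentifier: UUID?   // nil once anonymized
    let createdAt: Date
}

let sixMonths: TimeInterval = 182 * 24 * 60 * 60
let twoYears: TimeInterval = 730 * 24 * 60 * 60

// Returns nil when the recording should be deleted outright.
func applyRetention(to recording: StoredRecording,
                    now: Date = Date()) -> StoredRecording? {
    var recording = recording
    let age = now.timeIntervalSince(recording.createdAt)
    if age > twoYears { return nil }                        // past max retention
    if age > sixMonths { recording.deviceIdentifier = nil } // anonymize
    return recording
}

let yearOld = StoredRecording(deviceIdentifier: UUID(),
                              createdAt: Date(timeIntervalSinceNow: -sixMonths * 2))
print(applyRetention(to: yearOld)?.deviceIdentifier == nil) // true: anonymized
```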

Voice Assistant Accuracy

If this story sounds familiar, it’s because Amazon and Google were also recently in the spotlight for their own voice assistant improvement policies.

Amazon, for its part, does not anonymize Alexa recordings, which are linked to a user account and device. Like Apple, Google uses numbered identifiers to link Google Assistant recordings to devices.

While Apple does appear to have some stringent privacy policies in place, the company does not allow users to listen to or delete their interactions with Siri. That’s something that Amazon and Google both allow.
