Apple Stops Letting Contractors Listen to Your Siri Conversations — For Now


After an alarming report last week revealed that Apple had third-party subcontractors listening to Siri recordings — some of which contained intimate details of people’s lives — Apple has now responded by shutting down the program entirely, according to a new report from The Verge.

Last week a “whistleblower” came forward who claimed to work as a subcontractor for Apple, assigned to a team responsible for analyzing and grading how Siri responds to requests. As part of their responsibilities, these contractors — who are not actually Apple employees — listen to snippets of recordings made by Siri on various Apple devices to determine whether Siri activations were intentional or accidental.

This latter part is where the problem seems to have come in: the anonymous subcontractor noted that members of the group often hear uncomfortable or private situations in the recordings, including doctor-patient interactions, business meetings, drug deals, and even people having sex, most of which are likely the result of Siri being triggered accidentally. While contractors are supposed to report all cases of accidental activation, the source indicated that there was no procedure in place for dealing with recordings that actually contain sensitive personal information.

Apple’s Response

In an initial response to the report, Apple pointed back to its privacy policies, stating that it has always been clear that some Siri requests are analyzed by humans to improve the service, but emphasized that this represents less than one percent of all Siri interactions, that recordings are “reviewed in secure facilities,” and that contractors are “under the obligation to adhere to Apple’s strict confidentiality requirements.” Still, given the transient nature of contract work, it’s safe to say that people might be a little more comfortable if these were actual Apple employees, rather than folks who could find themselves out on the street a week later.

Apple also emphasized that the voice recordings cannot be linked back to an actual user. While the original whistleblower said that contact information was available for some recordings, it seems likely they were speaking of the content of the recordings themselves (i.e. a recording of somebody saying a name or phone number), and not information collected by or stored by Apple.

Shutting It Down

Now, however, Apple has gone a step further and said that it’s going to be suspending the program completely — at least for now. That is, it will no longer use human contractors to grade Siri voice recordings:

We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.

Apple spokesperson, in a statement to The Verge

Apple has also said that it will be adding the ability for users to opt out in a future software update — or perhaps, more appropriately, be asked whether they want to opt-in in the first place. Although it stopped short of saying when that would be coming, based on how quickly Apple usually responds to hot-button issues like this, we wouldn’t be surprised to see it in iOS 13, or perhaps even sooner in an iOS 12.4.1 or 12.5 update.

Notably, however, Apple didn’t say whether it would stop actually saving Siri recordings on its servers, merely that it won’t be allowing humans — or at least human contractors — to listen to them. Currently Apple says that it stores recordings for an initial period of six months, after which it removes all identifying information for longer-term storage. However, since Apple claims that none of the requests are associated with an Apple ID in the first place, nor should they be traceable back to a person, it’s not entirely clear exactly what this “identifying information” includes. Previous reports have suggested that recordings are tagged with hardware device IDs of some kind, which would theoretically allow them to be traced back to the original speaker, in a roundabout way.

It’s also worth noting that Apple has only said that it’s temporarily suspending the program, so it’s entirely possible that it will start back up again, although we’re guessing that will only happen after Apple has the necessary software updates out to make user participation optional, rather than mandatory.
