Apple Will Resume Listening to Siri Audio with Some Important Changes

Siri on iPhone | Credit: Wachiwit / Shutterstock

Following revelations last month that Apple had contractors listening to Siri recordings as part of a “grading program” used to judge the quality of the voice-activated assistant, the company fairly quickly shuttered the program to assuage privacy concerns while it reviewed the procedures it had in place.

According to what the original whistleblower — a former Apple contractor who worked for the program — told The Guardian, many of the recordings contained intimate details of people’s lives, including doctor-patient interactions, business meetings, drug deals, and even people having sex. Later reports from other contractors, however, revealed that although they listened to more than 1,000 Siri interactions per day, many of these were actually pretty innocuous, and every Siri user’s identity was kept anonymous, although in some cases the recorded audio itself may have contained personally identifying information.

The grading program was intended to assess how accurately Siri responded to various interactions, but the problem stemmed from accidental triggers: cases where a Siri device thought it heard the “Hey Siri” activation phrase and began capturing audio from nearby people in anticipation of an actual command.

When Apple suspended the grading program, it announced that it would stop using human contractors to grade Siri voice recordings while it reviewed its procedures, and that it would add the ability for users to choose whether or not to participate in grading.

Apple’s Solution

Now Apple has released a formal statement, reemphasizing its commitment to privacy as “a fundamental human right” and explaining how it plans to improve Siri’s privacy protections, adding that it has “decided to make some changes to Siri as a result” of the recent grading program scandal.

As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.

In the statement, Apple makes three commitments to improving how it handles Siri audio recordings, including requiring users to deliberately opt in to having their Siri interactions analyzed and limiting the grading program to actual Apple employees rather than third-party contractors.

Apple now says that it will no longer retain audio recordings of Siri interactions, except for those users who specifically give it permission to do so by opting into the program. Prior to this, Apple saved recordings of all Siri interactions, deliberate or accidental, for up to two years in various anonymized forms.

Users will also be able to opt out of the program at any time, and Apple promises that recordings captured by inadvertent activations of Siri will be deleted. What’s less clear, however, is whether recordings made by a user who once opted into the program will be retained if they later choose to opt out.

For everyone who doesn’t opt in, actual audio recordings won’t be stored, but Apple does note that it will “continue to use computer-generated transcripts to help Siri improve,” suggesting that voice-to-text transcripts of Siri interactions will still be retained. Users who don’t want transcripts of their Siri audio kept will need to disable Siri and Dictation entirely.

Siri’s Current Privacy Protections

In its statement, Apple also naturally touts the privacy steps it has already taken, while acknowledging that these obviously haven’t been enough. Even so, its process is far more protective of user information and identity than many competing voice assistants offer.

As with almost every AI feature Apple releases, as much processing as possible is done on the device. For example, Apple’s Photos app handles face and object recognition on the user’s iPhone or iPad rather than relying on server-side analysis, and Siri listens for “Hey Siri” locally, so no server interaction occurs until the trigger phrase is uttered.

Unfortunately, unlike Photos, Siri needs to interact with Apple’s servers by its very design, although Apple notes that this is done as little as possible. For example, when you ask Siri to read your messages, the request is analyzed by Apple’s servers, but the response simply instructs your iPhone to read the messages locally. The messages themselves are never processed through Apple’s servers for this purpose.
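Conceptually, that pattern (send only the interpreted request to the server and keep the underlying data on the device) can be sketched in a few lines of Swift. This is a purely hypothetical illustration, not Apple’s actual protocol: the AssistantInstruction type, the stubbed interpretRequest function, and the sample messages are all invented for the example, and the server round trip is replaced by a local stub.

```swift
import Foundation

// Hypothetical sketch of the "minimal server interaction" pattern described
// above: the server only interprets the request and returns an instruction;
// the sensitive data (the messages) never leaves the device.
enum AssistantInstruction {
    case readUnreadMessagesLocally   // server tells the device *what* to do
    case unknown
}

// Stand-in for a server round trip that classifies the spoken request.
// In this sketch it's a local stub; a real system would call a remote API.
func interpretRequest(_ transcript: String) -> AssistantInstruction {
    transcript.lowercased().contains("read my messages")
        ? .readUnreadMessagesLocally
        : .unknown
}

// Local, on-device message store; its contents are never sent anywhere.
let localMessages = ["Dinner at 7?", "Meeting moved to Friday"]

switch interpretRequest("Hey Siri, read my messages") {
case .readUnreadMessagesLocally:
    // The device itself reads the messages; only the instruction came back.
    localMessages.forEach { print($0) }
case .unknown:
    print("Sorry, I didn't get that.")
}
```

The key point of the sketch is that the server’s reply is just an instruction, while the message contents stay on the device.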

Apple also explains the lengths it goes to in order to anonymize Siri requests so that it can’t tie them back to a specific Apple ID or phone number. Instead, a random identifier is generated on the user’s device to associate Siri requests with the originating user, and there’s no easy way to tie that identifier back to the actual person making the request.
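As a rough illustration of how such a device-generated identifier might work, here is a hypothetical sketch. None of it reflects Apple’s actual implementation: the SiriRequestTag type, its storage key, and the rotation policy are invented for the example. The point is simply that requests get tagged with a random value rather than anything derived from the user’s account.

```swift
import Foundation

// Hypothetical sketch: tag outgoing assistant requests with a random,
// device-generated identifier instead of any account identity (Apple ID,
// phone number, etc.). Names and rotation policy here are invented.
struct SiriRequestTag {
    private static let storageKey = "assistant.random.identifier"

    // Returns the current random identifier, creating one on first use.
    static func current(defaults: UserDefaults = .standard) -> String {
        if let existing = defaults.string(forKey: storageKey) {
            return existing
        }
        let fresh = UUID().uuidString        // random, not derived from the user
        defaults.set(fresh, forKey: storageKey)
        return fresh
    }

    // Rotating the identifier breaks the link between past and future requests.
    static func rotate(defaults: UserDefaults = .standard) {
        defaults.set(UUID().uuidString, forKey: storageKey)
    }
}

// Example: a request is associated only with the random tag.
let tag = SiriRequestTag.current()
print("request tagged with: \(tag)")  // no Apple ID or phone number involved
```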

Apple has also published a support document, Siri Privacy and Grading, that echoes much of what appears in the company’s formal newsroom statement, while also offering some answers to frequently asked questions, such as whether Siri is always listening and what employees will be able to hear from the recordings if users opt into the program.

Contractors Sacked

Unfortunately, since the folks working on the program were employees of third-party Apple contractors such as Ireland’s Globetech, Apple’s decision to stop using contractors for the program has led to at least 300 people suddenly losing their jobs, according to The Guardian.

The contractors involved in Apple’s Siri grading program had originally been placed on paid leave after Apple announced its decision to suspend the program back on August 2nd, but they have more recently had their contracts terminated, undoubtedly as a result of Apple’s decision to use only its own employees when it resumes the program later this fall.
