Apple Is Being Sued for Letting Siri Record Users without Their Consent

After news broke last month that Apple had third-party contractors listening to private Siri conversations, it was all but inevitable that lawyers would get involved sooner or later. Now the first shots have been fired, with a class-action lawsuit filed in U.S. federal court in Northern California.

First spotted by Kif Leswing of CNBC, the class-action lawsuit accuses Apple of the “unlawful and intentional recording of individuals’ confidential communications without their consent,” claiming that Apple has violated several laws by doing so and that the company has made no effort to inform its customers that it was engaging in this behaviour.

Specifically, the lawsuit alleges that Apple has been in violation of California’s Invasion of Privacy Act, Consumer Legal Remedies Act, and Unfair Competition Law since 2011, when Siri was first introduced with the iPhone 4S.

Unauthorized Recordings

The lawsuit hinges on the claim that Apple has been making “unauthorized recordings” of users’ voices, not only with full knowledge that such recordings are commonplace, but while specifically hiring human reviewers to identify them. Yet at no point has Apple informed consumers that they are “regularly being recorded without consent.”

Unlike many jurisdictions, where only one party needs to agree to the recording of a conversation, California law prohibits the recording of any oral communication unless all parties consent. Even so, it’s unclear exactly who the “other party” would be (if any) in the case of a Siri request.

Apple has sold millions of Siri Devices to consumers. Many of these consumers would not have bought their Siri Devices if they had known Apple was recording their conversations without consent.

To be clear, the lawsuit doesn’t take issue with any recordings made following a valid use of “Hey Siri,” since it tacitly acknowledges that uttering the wake phrase constitutes consent on the part of the user. It does, however, suggest that Apple may still be liable for use of “Hey Siri” by minors, since they may not be legally able to consent to being recorded under any circumstances.

As evidence, the lawsuit specifically points to last month’s “whistleblower” report from a former Apple contractor who revealed that it was a “regular occurrence for Siri Devices to record nonconsenting individuals” and that such recordings included “confidential medical information, drug deals, and the recordings of couples having sex.” The suit also cites Apple’s 2018 letter to Congress in which it unequivocally responded “No” when asked whether its “iPhone devices collect audio recordings of users without consent.”

Sometimes Siri Is Too Eager

Apple’s statement that Siri only records conversations after it “hears” the phrase “Hey Siri” is technically correct, but the key challenge the company faces is that its Siri-enabled devices don’t always accurately recognize the wake phrase, meaning Siri can be triggered, and recordings made, even when users haven’t explicitly consented.
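To see why this happens at all, here’s a minimal and purely hypothetical sketch of how threshold-based wake-word detection generally works; the `WakeWordDetector` type, its `score` function, and the 0.85 threshold are illustrative assumptions, not Apple’s actual implementation. Any detector of this kind has to pick a confidence cutoff, and no cutoff eliminates mistakes in both directions.

```swift
import Foundation

// Hypothetical sketch, not Apple's code: a wake-word detector scores
// each chunk of audio against a trained model and starts recording
// once the score crosses a confidence threshold.
struct WakeWordDetector {
    /// Minimum confidence required to treat audio as the wake phrase.
    let threshold: Double

    /// Stand-in for a trained acoustic model's confidence score (0...1).
    /// A real detector would run a neural network here.
    func score(_ audioFrame: [Float]) -> Double {
        return Double.random(in: 0...1)
    }

    /// True when the detector believes it heard "Hey Siri". Any
    /// threshold is a trade-off: set it too high and real requests are
    /// missed; set it too low and bystanders who never consented get
    /// recorded, which is exactly the false positive at issue here.
    func shouldStartRecording(_ audioFrame: [Float]) -> Bool {
        return score(audioFrame) >= threshold
    }
}

let detector = WakeWordDetector(threshold: 0.85) // illustrative value
let frame = [Float](repeating: 0, count: 1024)   // stand-in audio buffer
if detector.shouldStartRecording(frame) {
    print("Recording started, possibly without a real wake phrase")
}
```

Raising the threshold in a sketch like this means more missed “Hey Siri” requests, while lowering it means more of the unintended activations at the heart of the lawsuit.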

This is especially true of the HomePod, which Apple doesn’t offer the ability to train to a specific voice (although this may be coming in iOS 13). Users of other Siri-enabled devices like the iPhone, iPad, and Apple Watch are far less likely to experience accidental triggers, since those devices train “Hey Siri” to the owner’s voice during setup, but it’s still not impossible for it to happen on rare occasions.

In other words, Siri may think it has consent to record the user when it technically doesn’t. While the legal issues around what constitutes consent are fairly well-established in normal human interactions, it would seem that this case will be more of a legal quagmire, since one of the “parties” is a machine.

Apple’s Dilemma

Apple will likely try to raise a “good faith” defence that it isn’t knowingly collecting unauthorized voice recordings. Further, if Apple’s front-end Siri devices like the HomePod can’t always accurately identify the “Hey Siri” key phrase, other automated machine learning algorithms on Apple’s servers aren’t likely to fare any better. This makes human review a necessary step if Apple hopes to improve the service and reduce the number of false positives.

The anonymization of the data may also be part of Apple’s defence, although legal experts will likely have to dig deep to explore this area as well. However, it seems unlikely that the anonymous nature of the recordings will let Apple off the hook, since it’s the very act of making a recording that violates the law, not so much the process of storing it or the way in which it’s stored.

Further, the claim that minors are unable to give consent at all could make things extremely difficult for Apple. If enforced, it would require the company to somehow identify whether a speaker is a minor and refuse to allow them to trigger Siri at all, a much more complicated challenge even for the very sophisticated machine learning algorithms Apple has already developed.

At no point did Plaintiffs consent to these unlawful recordings. Apple does not disclose that Siri Devices record conversations that are not preceded by a wake phrase or gesture. Plaintiffs Lopez and A.L., therefore, did not agree to be recorded by their Siri Devices, respectively. Moreover, Apple could not have obtained consent from Plaintiff A.L., a minor without an Apple account.

Of course, the alternative is for Apple to stop recording any and all Siri activations entirely. It’s uncertain whether this would satisfy the legal definition of “recording,” however, as every Siri request still has to be sent to Apple for analysis, and even if those requests are discarded afterwards, they are still technically “recordings” for the time during which they exist.

Questionable Claims

As with most class-action lawsuits, the allegations are all-encompassing, yet not all of the claims stand on solid ground. Most notably, the lawsuit explicitly states that it should apply to “all individuals who were recorded by a Siri device without their consent from at least as early as October 12, 2011.” However, since much of the lawsuit hinges on the “Hey Siri” wake phrase, it seems very unlikely that this would have been a problem prior to the fall of 2014, when Apple released the iPhone 6 and iOS 8 and introduced hands-free Siri activation using the wake phrase for the first time.

Although it was of course possible to inadvertently activate Siri before that by holding down the home button on an iPhone, it will be harder to argue that the physical act of pressing a button, even if done inadvertently, doesn’t constitute implied consent, in exactly the same way it would on any other recording device.

What’s also notable is that the initial plaintiffs in this case, Fumiko Lopez and A.L. (a minor under Lopez’s care), claim that the non-consensual recordings occurred on an iPhone XR and an iPhone 6, with “Apple and these Siri Devices unlawfully [recording them] on multiple occasions, including when they failed to utter a wake phrase.” While it’s not outside the realm of possibility to experience false Siri activations on an iPhone, “Hey Siri” voice training makes this a considerably less common experience, and it has become extremely rare with recent iPhone models and iOS versions, a point that Apple’s lawyers will almost surely raise.

Given the concealed and secretive nature of Defendant’s conduct, more evidence supporting the allegations in this Complaint will be uncovered after a reasonable opportunity for discovery.

The plaintiffs are seeking discovery to gather more evidence for their allegations of Apple’s misconduct, as well as asking for members of the class to be identified through Apple’s own records, which they claim should include every person who has ever owned a device capable of making Siri requests, going all the way back to the iPhone 4S in 2011. They are also seeking unspecified damages, as well as a court order that would require Apple to delete all Siri recordings.

For its part, Apple has already announced that it’s temporarily suspending its Siri analysis program, although it hasn’t said what it has done with the recordings already collected. It seems most likely that the program is only on hold until Apple can address the privacy issues, which the company says will include adding an option in a future iOS update to allow users to opt out of (or into) the program.
