Siri’s Hebrew Voice Artist Sues Apple for Improper, Humiliating Use of Vocals

Voice-driven personal assistants like Siri, Google Assistant and Amazon’s Alexa provide users with clear and articulate verbal feedback using the spoken utterances of many talented vocal artists around the world.

Sadly, one of these many talented providers for Apple’s Siri platform is not happy with the company’s alleged “unauthorized use” of her voice and has sued the tech giant accordingly.

Galit Gura-Eini, an Israeli native who provided the female voiceover samples for Siri’s Hebrew-language voice, filed the lawsuit this week in the Tel Aviv District Court.

In addition to asserting that Apple “illegally” obtained and used her voice without permission, Gura-Eini charges that people have used Siri to make her voice say a variety of “sexual, racist, or violent things.”

Express Consent?

According to court documents, however, Gura-Eini, who has also provided voice samples for the Waze navigation app, was among several vocal artists whose voices were recorded several years before Siri was even unveiled.

Specifically, hers were recorded back in 2007 by a subsidiary of Nuance Communications, an industry-leading A.I. and voice-recognition software firm whose renowned speech-to-text platform largely influenced Siri’s early development.

Gura-Eini allegedly gave Nuance express consent to use her recordings, but only in “legitimate” situations, court documents reveal. The vocalist first became aware of Apple’s use of her voice when Hebrew was added to Siri’s roster of supported languages back in 2016.

And while she allegedly approached the iPhone maker earlier this year seeking to have her voice removed, the company had declined as of the suit’s filing, with Apple’s legal counsel maintaining that it has done nothing wrong: not only were Gura-Eini’s recordings obtained legally, the company contends, but the voice Siri speaks with is not even legally her own.

“Her voice on the Siri app is nothing but syllables joined together by an algorithm,” a quote from Apple’s legal counsel reads, according to court documents.

Gura-Eini’s legal counsel, meanwhile, asserts that their client’s voice is “widely identified and associated” with her public persona, and goes on to note that consumers have used Siri to make that voice say “inappropriate things,” including racist, sexual, and derogatory comments, which, they assert, amounts to turning her voice “into a vehicle for improper and humiliating speech.”

She is seeking $66,000 in damages and legal fees in the case, according to documents reviewed by Calcalist. 

Of course, this wouldn’t be the first time that Apple’s been sued over something Siri-related — but, in a strange twist of irony, it may very well be the first time Siri’s sued back.
