Siri Has Been Updated to Understand Sexual Assault Queries and Provide Help

Realistically speaking, it might seem as if there's very little Apple's personal voice assistant, Siri, cannot do. From solving mathematical equations to locating nearby businesses and addresses, and even scheduling appointments on your calendar, Siri is quite the useful sidekick, helping iPhone and iPad users near and far get to the bottom of whatever task is at hand.

And now, thanks to a recent collaborative effort between Apple and the Rape, Abuse, and Incest National Network (RAINN), a software update has equipped Siri to respond to disclosures of sexual assault, abuse, and other personal emergencies.

[Image: An early version of Apple's assistant Siri]

On March 17th, 2016, Apple updated Siri's software to add certain phrases, such as "I was raped" or "I am being abused," to the iOS virtual voice assistant's index. According to ABC News, the Silicon Valley tech giant also programmed Siri's responses to include direct web links to the National Sexual Assault Hotline.

Notably, the March 17th update was pushed to devices just three days after a related study was published in the journal JAMA Internal Medicine. The study, which examined the efficacy of the big four virtual assistants (Apple's Siri, Google Now, Microsoft's Cortana, and Samsung's S Voice), found all of them lacking when it came to offering support during and after personal emergencies. The findings, in turn, led Apple to contact RAINN shortly thereafter.

Jennifer Marsh, Vice President of Victim Services at RAINN, said, "We have been thrilled with our conversations with Apple so far, and we have both agreed that this would be an ongoing process and collaboration."


To optimize Siri's responses to sexual assault-related inquiries, Apple reportedly gathered the keywords and phrases most commonly received by RAINN through its web-based chat service and telephone hotlines. Siri's responses were also adjusted to take a softer tone: Marsh noted that Siri now replies to personal emergencies with phrases along the lines of "you may want to reach out to someone," rather than the more forceful "you should reach out to someone."

And with virtual voice assistants such as Siri becoming more powerful with each successive update, they could well become channels through which victims not only report emergencies, but also obtain the support they need to pull through them.

As Marsh explained, "The online service can be a good first step, especially for young people. They are more comfortable in an online space rather than talking about it with a real-life person. There's a reason someone might have made their first disclosure to Siri."

Apple, for its part, has been aggressively updating Siri's software so that it's capable of providing more accurate answers, even despite the voice assistant's occasional run-ins with technical difficulties.

For instance, back in January, Apple had to address a reported flaw in Siri's response database that led users searching for abortion clinics to instead receive results for adoption agencies and fertility clinics. As with most things Apple, that issue was rectified via a subsequent software update.


What do you think of Siri being able to assist you in emergencies, such as cases of rape or abuse? Would you be comfortable confiding those experiences to a virtual voice assistant? Let us know in the comments below.
