Researchers Reveal ‘Hidden’ Messages Siri Can Hear But You Can’t

HomePod and iPhone X. Credit: iMore

U.S. and Chinese researchers have discovered a method to give digital assistants “secret” commands that are inaudible to the human ear.

The researchers found a way to use hidden messages, undetectable to humans, to activate A.I. assistants like Siri and Alexa. The same messages can issue commands to the assistants, making them perform tasks their owners never intended, The New York Times reported.

These commands can be embedded within a song, spoken text, or other audio recordings. When the tracks are played near an Amazon Echo or iPhone, nearby listeners hear only the ordinary audio, but the digital assistants hear and respond to the hidden commands, all without their owners knowing.

The hidden commands can instruct a digital assistant to dial phone numbers, make online purchases, or visit websites. They could also carry out more malicious tasks, such as covertly wiring money to an account or opening smart doors and locks.

The worrying implications were most recently highlighted in a research paper written by students from the University of California, Berkeley and Georgetown University and published this month.

Last year, a joint team of Chinese and U.S. researchers performed similar tests using a method they dubbed “DolphinAttack.” Another team at the University of Illinois corroborated the findings and demonstrated that the commands could work from as far as 25 feet away from a smart device.

The exact method the Berkeley and Georgetown team used isn’t an immediate threat to most people, but the researchers cautioned that similar technology could already be under development by malicious actors.

Nicholas Carlini, a co-author of the most recent paper, told the Times that he’s “confident” his team will soon be able to deliver commands capable of exploiting any smart device currently on the market.

Apple builds safeguards into many of its devices that could mitigate the risk of these attacks. For example, iOS devices require a passcode before Siri will carry out commands that unlock smart doors. The HomePod, however, lacks these protections.

In the meantime, Carlini hopes to bring awareness to the issue so that firms can implement more stringent fixes. “We want to demonstrate that it’s possible,” he told the Times. “And then hope that other people will say, ‘O.K. this is possible, now let’s try and fix it.’”
