Your Smart Speaker Can Be Hacked with a Laser, But HomePod Users Needn’t Worry

Image: Apple HomePod and iPhone X. Credit: Tech Advisor

Hackers are getting cleverer all the time, and now a team of security researchers has uncovered a way to issue voice commands to a HomePod, Google Home, or Alexa-enabled speaker simply by shining a laser at it.

According to Ars Technica, a research paper published this week describes how “Light Commands” can be used to inject inaudible, and often invisible, commands into smart speakers, or really any other device that uses a common type of microphone.

The technique works because microphones that use microelectromechanical systems (MEMS) technology respond to light as if it were sound. That behavior isn’t an intentional part of the design; it’s a side effect of how the technology works, and it’s that side effect that creates the security vulnerability.
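In rough terms, the attacker amplitude-modulates the laser’s intensity with the audio of a spoken command, and the MEMS diaphragm picks up those intensity variations as if they were sound pressure. The snippet below is a minimal, purely illustrative sketch of that conversion step, not code from the paper; the file name, modulation depth, and bias value are all assumptions.

```python
# Rough, illustrative sketch (not code from the paper): convert a recorded voice
# command into a 0..1 intensity envelope that could, in principle, drive the
# modulation input of a laser driver. File name, modulation depth, and bias
# are assumptions for illustration only.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("voice_command.wav")   # hypothetical recording
samples = samples.astype(np.float64)
samples /= np.max(np.abs(samples))                  # normalize to the range -1..1

modulation_depth = 0.5   # assumed: how strongly the audio modulates the beam
bias = 0.5               # assumed DC offset so the laser never fully switches off
intensity = bias + (modulation_depth / 2.0) * samples   # stays within 0..1

# A MEMS microphone's diaphragm responds to these light-intensity variations
# roughly as if they were sound pressure, which is why the assistant "hears"
# the command even though nothing audible was played.
print("intensity range:", intensity.min(), intensity.max())
```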

It’s worth noting that because the attack uses a laser, it requires a line of sight to the smart speaker. That said, the beam easily passes through ordinary household windows, and the lasers used have enough range to cover quite a bit of distance, so the attacker could be located in another building, such as a house or apartment across the street.

The attack also requires a high degree of precision. The laser can’t simply be beamed at the smart speaker in general; it has to be aimed at a very specific part of the microphone. Unless an infrared laser is used, anybody near the device will be able to see the beam, and while the light-based command itself is inaudible, the speaker will respond out loud just as if the command had been spoken.

A Low-Cost Attack

Of course, when you hear the word “laser,” you may picture an attack that requires sophisticated and expensive equipment. In fact, the researchers describe a variety of setups that could be used to carry it out, the cheapest of which requires only an $18 laser pointer and about $400 in additional parts. Standard telephoto camera lenses can also be used to focus the laser for long-range attacks.

That said, pulling this off requires some technical expertise with lasers, so it’s not something just anybody could accomplish. Still, it’s hardly technology that’s only within the reach of government and law enforcement agencies.

The Potential Risk

Even though the attack isn’t simple to perform, it’s far from impossible, and the danger is that injected commands could do things like unlock doors, order products from Amazon, or even unlock and start the target’s car.

Much of this, of course, depends on how connected the user’s smart speaker is, but Amazon Alexa devices are notorious for offering a wide variety of skills, and of course they serve the company’s purpose of helping its customers spend more money on its e-commerce site.

While features like unlocking doors are often secured by PINs, the researchers note that most devices don’t protect against PIN brute-forcing, so an attacker could simply try every possible PIN until one is accepted and the front door unlocks.
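To see why a PIN alone isn’t much of a barrier, consider that a four-digit PIN has only 10,000 possible values. The sketch below is a toy illustration of the brute-force idea; speak_via_laser() is a hypothetical placeholder for the light-injection step, not a real API.

```python
# Toy illustration of the brute-force idea: a four-digit PIN has only 10,000
# possible values, so an attacker who can inject commands silently can simply
# walk through all of them. speak_via_laser() is a hypothetical placeholder
# for the light-injection step, not a real API.
from itertools import product
from typing import Optional

def speak_via_laser(command: str) -> bool:
    """Placeholder: would modulate the command onto the laser beam and
    report whether the assistant accepted it."""
    raise NotImplementedError

def brute_force_pin() -> Optional[str]:
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if speak_via_laser(f"Unlock the front door. The PIN is {pin}."):
            return pin
    return None
```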

Why HomePod Users Are Relatively Safe

This is an area where the limitations of Apple’s HomePod and Siri ecosystem work to users’ advantage. There simply isn’t much the smart speaker can do that’s likely to be at risk from an attack like this.

For example, Siri cannot order products from anywhere, nor can it integrate with in-car systems, at least not from the HomePod. While it does provide access to HomeKit commands, it cannot be used to unlock doors or open garage doors unless the user specifically authenticates on their iPhone after making the request. About the most an attacker could do with a HomePod and Siri is turn your lights on or off, adjust your thermostat, or close and lock your doors. That leaves some room for mischief, but little reason to worry about somebody using it to break into your home, steal your car, or spend your money.

That said, this attack should make you think twice about keeping even your HomePod near a window, and if you’re using an Amazon Alexa or Google Home speaker, you may want to reconsider what you actually let it do with your voice.
