Apple has generally taken a very slow and measured approach to opening up its products and features to third-party developers, and while the company is often loudly criticized for its reluctant in doing so, it’s hard to argue that this usually results in much better security and privacy for its users.
The latest battlefield where this is now becoming more readily apparent is in the smart speaker market. While Apple hasn’t completely avoided privacy problems with Siri, it’s been quick to respond and fix problems, and it’s still miles ahead of the competition when it comes to respecting user privacy.
However, as much as companies like Amazon and Google may be guilty of listening in on their users’ smart speaker interactions, it seems that they’ve done even less to protect users from what third-party apps could be doing with their devices.
According to a new analysis by Security Research Labs (SRLabs), third-party developers can abuse “Skills” for Alexa and “Actions” for Google Home to listen in on users or even “voice-phish” passwords. SRLabs specifically found two possible hacking scenarios that could apply to either Amazon’s Alexa or Google Home that would allow third-party apps to turn these digital assistants into “Smart Spies.”
As a proof-of-concept, SRLabs created a set of voice applications that exploited vulnerabilities in Alexa and Google Home, along with the almost non-existent vetting process that’s employed by Alexa and Google for updates to skills and actions that run on their smart speakers.
The apps in question purported to be legitimate skills like horoscope apps that actually hid malicious code that was added after the app had already been approved by Amazon and Google — while both companies perform some level of security review for new skills and actions, neither checks at all on updates, even though developers can completely change the functionally of an “intent” within the app when updating it.
Using three specific behaviours common to both Alexa and Google Home, the SRLabs researchers were able to “phish” for user’s passwords and eavesdrop on users after they thought the speaker had stopped listening.
In the first example, SRLabs demonstrates how a seemingly innocuous app could be triggered to ask for a user’s password in such a way as to dupe an unsuspecting user to provide it.
In this case, the application is designed to play a fake error message in place of the normal welcome message that will make the user believe that the app has not actually started. It then inserts what sounds like a lengthy audio pause by “speaking” unpronounceable characters, before playing a message to try and trick users into revealing their password, such as letting them know that an important security update is available and they need to supply their password to install it.
Of course, this can be used not only for passwords, but for any other types of information that a user could be led to believe the Alexa or Google Home might request, such as asking a user to “confirm” a phone number, email address, or shipping address.
Using a similar method of misleading intents and long pauses, it was also relatively trivial for the researchers to create an eavesdropping app that could be used on either Amazon’s Alexa or Google Home.
The trick here is basically to exploit the ability to redefine the “stop” intent to keep the speaker listening, again by telling the assistant to say a long series of unpronounceable characters, during which time it’s still listening for, and potentially recording, everything that’s being said in the room.
What About the HomePod?
Apple’s HomePod isn’t mentioned in the report simply because there are no third-party “skills” available for the HomePod itself. The closest Apple’s HomePod comes to providing third-party app support is through SiriKit apps that run on a paired iPhone, iPad, or iPod touch, which work quite a bit differently.
In fact, this provides even more justification for why Apple’s expansion of Siri has been relatively slow compared to other voice assistants. Each new category of app that can access information via Siri comes with its own unique security and privacy concerns, so Apple’s choice to phase them in slowly with each new major iOS release is probably a wise one.
While it would be very theoretically possible for an app to have a SiriKit vulnerability that could be exploited via the HomePod in some way, the nature of how SiriKit works would make this extremely difficult. In short, there are no open eavesdropping channels, nor is there any easy way for an app to randomly speak through a HomePod without being triggered from the user’s mobile device. Of course, as features like Siri Shortcuts become more sophisticated, it’s only a matter of time until exploits like these become a bit more feasible on a HomePod as well, but even in the case of Alexa and Google Home, it’s the lack of oversight by Amazon and Google that has contributed most to this problem, and these are the sort of things that are far less likely to slip by Apple’s review process.