Stories of Smart Spies: How Third Party Apps Could Easily Spy on Amazon Alexa and Google Home Users
Google Home and Amazon Alexa are known security threats making it possible and easy for both the tech giants and criminals to ultimately gain access to a user's home. New threat made possible through devices has now been identified by whitehat hackers of Security Research Labs.
The team developed eight apps as four Alexa "skills" and four Google Home "actions" that passed through Amazon and Google vetting processes but were actually spying on users. SRLabs team created these apps for each platform with legitimate use (like a horoscope app), but hid malicious code inside.
SRLabs proves Alexa and Google Home are nothing but "smart spies"
The team created voice applications to demonstrate the vulnerability of both these platforms, turning the assistants into what they call "Smart Spies." They focused on the following "building blocks" to eventually compromise user privacy by both collecting personal data - including passwords - and eavesdropping on users after they believed their smart speaker had stopped listening.
a. We leverage the “fallback intent”, which is what a voice app defaults to when it cannot assign the user’s most recent spoken command to any other intent and should offer help. (“I’m sorry, I did not understand that. Can you please repeat it?”)
b. To eavesdrop on Alexa users, we further exploit the built-in stop intent which reacts to the user saying “stop”. We also took advantage of being allowed to change an intent’s functionality after the application had already passed the platform’s review process.
c. Lastly, we leverage a quirk in Alexa’s and Google’s Text-to-Speech engine that allows inserting long pauses in the speech output.
In one demonstration, SRLabs researchers used an unpronounceable text string during which time speaker remained silent but it was active to trick users into giving up their passwords as this silent string was followed by the speaker saying; "An important security update is available for your device. Please say start update followed by your password."
SRLabs also said that they changed the functionality of the voice app after it was reviewed by Google or Amazon and this change didn't prompt a second review, leaving it up to the third party app developers to include whatever malicious code they wanted.
They also listened in on conversations after a user had used the "stop" command to stop the app, demonstrated in the following video:
SRLabs has now published their research and videos. Researchers warned that Alexa and Google home are incredibly powerful devices that can be used to eavesdrop on users in private environments. "The privacy implications of an internet-connected microphone listening in to what you say are further reaching than previously understood," SRLabs writes, adding that users need to be more aware of using a new voice app that abuse their smart speakers.
"It was always clear that those voice assistants have privacy implications - with Google and Amazon receiving your speech, and this possibly being triggered on accident sometimes," Fabian Bräunlein, senior security consultant at SRLabs, told the folks at ArsTechnica.
"We now show that, not only the manufacturers, but... also hackers can abuse those voice assistants to intrude on someone's privacy."
Google in its statement said that it's putting additional mechanisms in place while Amazon said that it has implemented mitigations to prevent and detect this type of skill behavior. But years of history is proof enough that there will always be loopholes. While smartphones have become a necessity, these smart speakers are probably not even worth the risk.