Study Reveals That Malicious Google Assistant, Alexa, and Siri Commands Can Be Hidden in Music and Speech Recordings

May 10, 2018

I’m still not sold on the concept of speaking to home appliances, but that’s the direction the world is headed, and we’ll have to accept it. It turns out that anything that listens for speech is also susceptible to subliminal messaging. Even virtual assistants such as Alexa, Google Assistant, and Siri are not safe from commands that slip past the human ear unheard. According to research conducted by a group of students from Berkeley, hidden commands can be issued to popular virtual assistants using recordings or music. From an NYTimes article on the issue:

Over the past two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.

Sounds absurd, doesn’t it? The researchers achieved this by making subtle changes to audio files, cancelling out the sound that the speech recognition system was supposed to hear and replacing it with a sound that machines interpret differently while remaining virtually undetectable to the human ear.
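The core idea is that the altered waveform stays numerically almost identical to the original, so humans hear nothing unusual. Here is a minimal, purely illustrative sketch of that amplitude budget using NumPy — the perturbation below is random noise and all parameters are assumptions; the real attacks optimize the perturbation against the target speech recognition model so that the transcription changes:

```python
import numpy as np

# Toy illustration of the idea behind audio adversarial examples:
# add a perturbation "delta" to a waveform while keeping its amplitude
# so small that a listener would barely notice. Real attacks optimize
# delta against the target speech recognition model; here delta is
# just random noise, to show how small the change can be.

SAMPLE_RATE = 16_000   # 16 kHz, typical for speech pipelines (assumed)
EPSILON = 0.002        # per-sample perturbation budget (assumed)

rng = np.random.default_rng(0)

# One second of a stand-in "speech" waveform (a 440 Hz tone).
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
speech = 0.5 * np.sin(2 * np.pi * 440 * t)

# Craft a perturbation bounded by EPSILON in absolute amplitude.
delta = rng.uniform(-EPSILON, EPSILON, size=speech.shape)
adversarial = np.clip(speech + delta, -1.0, 1.0)

# The perturbed audio is numerically almost identical to the original,
# yet an *optimized* delta of this size can change what an automatic
# speech recognition system transcribes.
max_diff = np.max(np.abs(adversarial - speech))
print(f"max per-sample change: {max_diff:.4f}")
```

The printed change is bounded by `EPSILON`, i.e. roughly 0.4% of the signal’s peak amplitude here — far below what a casual listener would pick out of music or speech.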


They were even able to hide the command “O.K. Google, browse to evil.com” in a recording of the spoken phrase “Without the data set, the article is useless.” How did they do that? We don’t quite know yet. Last year, researchers at Princeton University and China’s Zhejiang University demonstrated that voice recognition systems could be activated using frequencies inaudible to the human ear. Here’s a video of it in action:

The technique has its limitations, though. The transmitter must be close to the target device, although a more powerful ultrasonic transmitter can extend the effective range; researchers at the University of Illinois demonstrated ultrasound attacks from 25 feet away. You’ll still need a direct line to the device, as the commands cannot penetrate walls.
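The inaudible-frequency trick is commonly described as amplitude-modulating a voice command onto an ultrasonic carrier; a microphone’s hardware nonlinearity then demodulates the command back into the audible band, while bystanders hear nothing. The sketch below shows the modulation step only, with illustrative parameters that are my assumptions, not figures from the research:

```python
import numpy as np

# Sketch of the ultrasonic-injection idea: amplitude-modulate an
# audible "command" signal onto a carrier above human hearing
# (~20 kHz). All parameters are illustrative assumptions.

FS = 192_000         # output sample rate high enough for ultrasound
CARRIER_HZ = 25_000  # above human hearing, within mic sensitivity
DURATION_S = 1.0

t = np.arange(int(FS * DURATION_S)) / FS

# Stand-in for a recorded voice command (a 300 Hz tone here).
command = 0.5 * np.sin(2 * np.pi * 300 * t)

# Classic AM: carrier scaled by (1 + m * command), m = modulation depth.
m = 0.8
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
ultrasonic = (1 + m * command) * carrier

# Verify that the transmitted energy sits near 25 kHz, i.e. entirely
# outside the audible band, so a person nearby hears nothing.
spectrum = np.abs(np.fft.rfft(ultrasonic))
freqs = np.fft.rfftfreq(len(ultrasonic), d=1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(f"spectral peak at {peak_hz:.0f} Hz")
```

The spectrum peaks at the 25 kHz carrier with sidebands just beside it, and essentially no energy below 20 kHz — which is why the attack is silent to humans but, through microphone nonlinearity, audible to the assistant.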

The potential for misuse of the technique is immense. An attacker could point a transmitter in the general direction of a smart speaker and ask it to unlock the door, or a pesky neighbour could pad your Amazon shopping list with a few hundred items you never wanted — and the list goes on. This isn’t the first time we’ve seen smart devices go haywire because of ambient audio: an episode of South Park had millions of Google Home and Alexa speakers spewing obscenities after their respective hot words were repeatedly broadcast. With almost all virtual assistants gaining more features, it’s time we addressed the inherent security loopholes they open up.

Source: 9to5google
