
Silent voices can cause Alexa and Siri to turn against users
Popular speech-recognition systems can be hijacked by hackers using commands inaudible to the human ear and made to do things their owners would rather they did not
In a revelation that will send shivers down the spines of technology lovers fond of scheduling an Uber ride or ordering a takeaway by barking instructions at their gadgets, Chinese researchers have shown how popular speech-recognition systems such as Alexa and Siri can be told to open malicious websites and download malware via commands shifted into ultrasonic frequencies inaudible to the human ear.
These systems’ susceptibility to voice commands emitted at frequencies above 20,000 Hz also exposes device owners to other types of cyberattack: owners of smart gadgets could potentially be spied on in their homes through the use of the so-called ‘dolphin attack’.
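For readers curious about the mechanics, the rough idea is to take an ordinary spoken command and shift it above the limit of human hearing before playing it at a nearby device. The short Python sketch below illustrates one plausible way of doing this, by amplitude-modulating a recorded command onto an ultrasonic carrier above 20,000 Hz; the sample rate, carrier frequency and the stand-in ‘command’ signal are illustrative assumptions rather than details taken from the researchers’ paper.

    # Illustrative sketch: moving an audible voice command above the range of
    # human hearing by amplitude-modulating it onto an ultrasonic carrier.
    # The sample rate, carrier frequency and synthetic "command" are placeholder
    # assumptions, not values from the published research.
    import numpy as np

    SAMPLE_RATE = 192_000   # Hz; high enough to represent ultrasonic content
    CARRIER_FREQ = 25_000   # Hz; above the ~20 kHz limit of human hearing

    def modulate_ultrasonic(command, sample_rate=SAMPLE_RATE, carrier_freq=CARRIER_FREQ):
        """Amplitude-modulate a baseband audio command onto an ultrasonic carrier."""
        t = np.arange(len(command)) / sample_rate
        carrier = np.cos(2 * np.pi * carrier_freq * t)
        # Classic AM: keep the envelope positive so the original command could be
        # recovered by a later (e.g. non-linear hardware) demodulation stage.
        normalised = command / np.max(np.abs(command))
        return (1.0 + 0.8 * normalised) * carrier

    if __name__ == "__main__":
        # Stand-in for a recorded voice command: a one-second, 440 Hz tone.
        t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
        fake_command = np.sin(2 * np.pi * 440 * t)
        ultrasonic = modulate_ultrasonic(fake_command)
        print(f"Output energy is centred near {CARRIER_FREQ} Hz, inaudible to most adults.")

The result is a signal whose energy sits around the ultrasonic carrier, so a person standing next to the loudspeaker hears nothing even though a susceptible microphone may still register the embedded command.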
Revelations about the relatively basic technique today prompted calls for voice-controllable systems to be redesigned so that they are resistant to the silent commands, which can be emitted from hackers’ own smartphones.
Because microphones on smart gadgets can typically only pick up these commands within a range of a few feet, attacks would be most likely to succeed in a crowded public place. A device’s assistant functions must already have been activated for any attempt at manipulation to work.
A team of academics at Zhejiang University describe the attacks as “sneaky” in a newly published paper that has been accepted by the ACM Conference on Computer and Communications Security.
They show how operations ranging from placing an unwanted FaceTime call on an iPhone to altering the navigation system in some types of car could be carried out without the knowledge of these systems’ owners.
Prior research has already shown that so-called ‘obfuscated voice commands’, which can be picked up by some animals such as dogs but are too high-pitched for people to hear, could affect speech-recognition devices.
However, this is believed to be the first time anyone has demonstrated in public precisely what kind of effect they can have.
The Apple iPhone, Google Nexus, Amazon Echo, and various automobiles are said to be vulnerable.
“Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user,” the researchers declare.
They add that they hope their research “serves as a wake-up call to reconsider what functionality and levels of human interaction shall be supported in voice controllable systems”.