Laser pointer could hack voice-controlled virtual assistants

Image credit: Dreamstime

Researchers have discovered a way to hack smart speakers using an ordinary laser pointer, allowing hackers to send remote, inaudible commands to these household devices.

The attack, dubbed 'light commands', leverages the design of smart assistants’ microphones, which are known as micro-electro-mechanical systems (MEMS) microphones and which work by converting sound (voice commands) into electrical signals.

In addition to sound, the researchers – from the University of Michigan (U-M) and the University of Electro-Communications in Tokyo – found that MEMS microphones also react to light being aimed directly at them.

The team said that they were able to launch inaudible commands by shining lasers from as far as 11m at the microphones on various popular voice assistants including Amazon Alexa, Apple Siri, Facebook Portal, and Google Assistant.

“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers said in their paper.

MEMS microphones feature a small, built-in plate called the diaphragm which, when hit by sound or light, produces electrical signals that are translated into commands.

Instead of voice commands, however, the researchers found that they could “encode” sound using the intensity of a laser light beam, which causes the diaphragm to move and results in electrical signals representing the hacker’s commands.
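The encoding idea described above can be illustrated with a short sketch: an audio waveform is mapped onto a laser-diode drive current, so the beam's intensity rises and falls like the sound pressure of the spoken command. The function names, bias current and modulation depth below are illustrative assumptions, not the researchers' actual tooling.

```python
import numpy as np

SAMPLE_RATE = 44_100          # audio sample rate in Hz
BIAS_CURRENT_MA = 200.0       # assumed DC bias that keeps the diode lasing
MODULATION_DEPTH_MA = 150.0   # assumed peak current swing carrying the audio

def encode_audio_as_diode_current(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a laser-diode drive current (mA).

    The diode current, and hence the beam intensity, tracks the audio
    amplitude, so the MEMS diaphragm sees light whose intensity varies
    like the sound pressure of the original command.
    """
    audio = np.clip(audio, -1.0, 1.0)
    current = BIAS_CURRENT_MA + MODULATION_DEPTH_MA * audio
    # Drive current can never go negative; the bias must exceed the swing.
    return np.clip(current, 0.0, None)

# Example: a 1 kHz test tone standing in for a voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 1000 * t)
current = encode_audio_as_diode_current(tone)
```

This is simple amplitude modulation: the attack works because the diaphragm responds to intensity changes in the light just as it does to pressure changes in the air.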

To develop such commands using the laser beam, the team measured light intensity (using a photodiode power sensor) and tested the impact of various light intensities (or diode currents) on microphone outputs.
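That measurement methodology might be sketched as a calibration sweep: step through diode currents, record the microphone's response at each step, and build a current-to-response curve. Since the paper's lab setup is hardware-specific, the response function below is a toy simulation (an assumption): no output below the diode's lasing threshold, roughly linear above it.

```python
def simulated_mic_response(current_ma: float,
                           threshold_ma: float = 50.0,
                           gain: float = 0.002) -> float:
    """Toy stand-in (assumption) for the measured microphone output:
    zero below the lasing threshold, roughly linear above it."""
    return max(0.0, (current_ma - threshold_ma) * gain)

# Sweep diode currents from 0 to 300 mA in 10 mA steps and record
# the (current, response) pairs, as a real calibration run would.
sweep = [i * 10.0 for i in range(0, 31)]
curve = [(i_ma, simulated_mic_response(i_ma)) for i_ma in sweep]
```

In a real experiment the simulated function would be replaced by readings from the photodiode power sensor and the target microphone's output.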

“We’ve shown that hijacking voice assistants only requires line-of-sight rather than being near the device,” said Daniel Genkin, assistant professor of Computer Science and Engineering at the University of Michigan. “The risks associated with these attacks range from benign to frightening, depending on how much a user has tied to their assistant.

“In the worst cases, this could mean dangerous access to homes, e-commerce accounts, credit cards, and even any connected medical devices the user has linked to their assistant.”

In their study, the team demonstrated that “light commands” could enable a hacker to remotely inject inaudible and invisible commands into smart speakers, tablets and phones in order to commit certain actions.

For example, these commands could be used to unlock a smart-lock-protected front door, open a connected garage door, shop on e-commerce websites at the target’s expense or locate, unlock and start a vehicle that’s connected to a target's account.

According to the team, just five milliwatts of laser power – the equivalent of a laser pointer – was enough to obtain full control over many popular Alexa and Google smart-home devices, while about 60 milliwatts were sufficient in phones and tablets.

To document the vulnerability, the researchers aimed and focused their light commands with a telescope, a telephoto lens and a tripod and tested 17 different devices, which represented a range of the most popular assistants.

“There is a semantic gap between what the sensors in these devices are advertised to do and what they actually sense, leading to security risks,” said Kevin Fu, associate professor of computer science and engineering at U-M. “In Light Commands, we show how a microphone can unwittingly listen to light as if it were sound.”

Users can take some measures to protect themselves, with Sara Rampazzi, a postdoctoral researcher in computer science and engineering at U-M, advising users to simply avoid putting smart speakers near windows or other places visible to an attacker.

“While this is not always possible, it will certainly make the attacker’s window of opportunity smaller,” she said. “Another option is to turn on user personalisation, which will require the attacker to match some features of the owner’s voice in order to successfully inject the command.”

Speaking to leading cyber-threat experts, E&T investigated how simple it is to hack Internet of Things (IoT) devices hooked up to the internet, exploring the implications of what this could mean for consumers and critical infrastructure in the UK.

