
Your Amazon Echo or Google Home could be fooled by a laser ‘speaking’ words

Hey Google, Alexa, Siri: pew pew pew

As handy as smart speakers can be, the fact they’re always listening can make them a little creepy — remember how smart speaker makers were caught using humans to listen to voice recordings? Now, researchers have demonstrated a potential new security risk: it’s possible to issue commands to smart speakers with lasers instead of spoken words, as Wired reports.

The researchers found that by modulating a laser’s intensity at the right frequencies and pointing it directly at a smart speaker’s microphone, they could make the microphone interpret the light as if it were sound, letting them issue commands to the voice assistant powering the device. And practically every voice assistant may be vulnerable to this attack vector: the researchers say they tested it on Google Home devices, Amazon Alexa devices, and Facebook’s Portal Mini, as well as several smartphones, including an iPhone XR, a sixth-generation iPad, a Samsung Galaxy S9, and a Google Pixel 2.
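If you’re curious what “making a microphone hear light” looks like in practice, here’s a minimal sketch of the general idea (not the researchers’ code; the function name and the bias and depth values are assumptions for illustration). It maps a recorded voice command onto a laser-intensity envelope, the kind of signal a laser driver would need to reproduce the command as flickering light:

```python
# Conceptual sketch only: map a voice recording onto a laser-intensity signal.
# A MEMS microphone hit by this modulated light responds as if it were sound.
import numpy as np
from scipy.io import wavfile  # any WAV reader would do

def audio_to_intensity_envelope(wav_path, bias=0.6, depth=0.4):
    """Turn a mono voice recording into a normalized laser-intensity envelope.

    bias  -- constant drive level that keeps the laser switched on (0 to 1)
    depth -- modulation depth; the audio swings the intensity around the bias
    """
    rate, samples = wavfile.read(wav_path)
    audio = samples.astype(np.float64)
    audio /= np.max(np.abs(audio))         # normalize to the range [-1, 1]
    intensity = bias + depth * audio       # amplitude-modulate around the bias
    return rate, np.clip(intensity, 0.0, 1.0)

# Fed to a laser driver through a DAC, this envelope would reproduce the
# command as variations in light intensity aimed at the microphone port.
```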

They’re calling the idea “Light Commands.” Here’s a video overview of how they work (the researchers’ website has a good explanation as well):

And here’s one example of a laser making a Google Home open a garage door. There’s also a video of a laser asking Google Home to tell the time and a video of a laser issuing a command across buildings:

This is wild stuff — but it’s probably not easy for a potential attacker to pull off.

For one, the attacker needs line of sight to the device they’re pointing a laser at — through a window, most likely. You can remove line of sight by keeping your speaker away from your windows or closing your curtains.

The attack also requires some specialized equipment to modulate the laser’s intensity — though the researchers did provide a list of the components you could use. Many of them can be bought on Amazon, and everything a potential attacker might need costs less than $500. You’ll still need some technical expertise to put everything together, but it isn’t prohibitively expensive to rig up a laser for this specific scenario.

Locks and alarms generally already have PIN protection

There are also some protections built into our devices that make it more difficult for them to be hijacked by a single voice request. Smartphone assistants like Siri generally make you unlock your phone or listen for a “trusted voice” before they run your commands. And some smart devices won’t activate as easily as the garage door from the video — many locks, alarm systems, and vehicle remote start systems require a spoken PIN before they will work. In theory, an attacker could modulate the lasers to “speak” the PIN, but they’d need to eavesdrop on you first. They could also brute force PIN attempts, but that means a lot more time before they can hijack your device.
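That time adds up quickly. As a rough back-of-the-envelope estimate (the five seconds per spoken attempt is an assumption on my part, not a figure from the research), brute-forcing a four-digit PIN by laser would take the better part of a day:

```python
# Rough estimate of brute-forcing a spoken 4-digit PIN with laser commands.
combinations = 10 ** 4            # 0000 through 9999
seconds_per_attempt = 5           # assumed: "speak" the PIN, wait for a reply
worst_case_hours = combinations * seconds_per_attempt / 3600
print(f"Worst case: about {worst_case_hours:.0f} hours of continuous attempts")
# Prints roughly 14 hours, before any lockouts or rate limiting kick in.
```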

The researchers suggest that smart speaker vendors could better prevent this kind of attack by requiring voice commands to be heard from two microphones or by having a light shield in front of the microphone. It’s not clear if the vendors will be making any immediate changes to address this vulnerability, though: Google and Amazon told Wired that they are reviewing the research paper, Apple declined to comment, and Facebook apparently did not respond before Wired published its article.
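To give a sense of how the two-microphone idea could work, here’s a minimal sketch (the function name and the 0.25 threshold are illustrative assumptions, not anything the vendors or researchers have published). A laser spot hits a single microphone port, while real speech reaches every microphone at roughly comparable levels, so a device could reject commands that show up on only one channel:

```python
# Illustrative sketch of rejecting "commands" that only one microphone heard.
import numpy as np

def command_looks_acoustic(mic_a: np.ndarray, mic_b: np.ndarray, tol: float = 0.25) -> bool:
    """Accept a command only if both microphones captured comparable energy."""
    energy_a = float(np.sqrt(np.mean(mic_a.astype(np.float64) ** 2)))  # RMS level
    energy_b = float(np.sqrt(np.mean(mic_b.astype(np.float64) ** 2)))
    loudest = max(energy_a, energy_b)
    if loudest == 0:
        return False  # silence on both channels, nothing to accept
    # A laser aimed at one port produces a large imbalance between channels.
    return abs(energy_a - energy_b) / loudest < tol
```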

Given what’s required to pull off this attack, it doesn’t seem likely that you’ll suddenly have to worry about people pointing lasers into your house to open your garage door. But it’s theoretically possible, and the research demonstrates yet another way that bringing an internet-connected microphone into your home can put your security at risk.