Laser pointers can apparently trick smart speakers, phones and tablets into following voice commands to open doors or make purchases, even from hundreds of feet away. Researchers from the University of Electro-Communications in Tokyo and the University of Michigan have revealed that they were able to take over Google Assistant, Apple Siri and Amazon Alexa devices by shining laser pointers or flashlights at their microphones. One of the researchers, Daniel Genkin, was also part of the team that discovered the Meltdown and Spectre CPU vulnerabilities.
The team has published a paper detailing the flaw, which they call "Light Commands," after seven months of experimentation. By focusing lasers with a telephoto lens, they were able to hijack smart speakers from 230 to 350 feet away. In fact, the Google Home they tricked into opening a garage door was inside a room in another building: the modulated laser beam they aimed at its microphone port through a window encoded the voice command "OK Google, open the garage door."
They explained that microphones contain a small plate called a diaphragm, which vibrates when sound hits it. By modulating a laser's intensity to mimic the pressure variations of speech, an attacker can make the diaphragm move as if it were hearing a voice, producing the same electrical signal a spoken command would. Opening the garage door via Google Home was easy, they said, and the same method could have been used to make online purchases, open doors protected by smart locks, and even remotely unlock cars connected to voice-AI-powered devices.
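To make the mechanism concrete, here is a minimal sketch of the signal-processing step involved: amplitude-modulating a recorded voice command onto a laser's drive level. It assumes NumPy and the soundfile library, the bias and depth values are illustrative, and the file name is hypothetical; the hardware side (DAC, laser-diode driver, optics) is omitted entirely.

```python
# Sketch: map a voice-command waveform to a normalized laser drive
# signal. The laser sits at a constant "bias" intensity, and the audio
# modulates intensity around that bias so the microphone's diaphragm
# responds as if it were hearing the spoken command.
import numpy as np
import soundfile as sf  # third-party: pip install soundfile

def audio_to_laser_drive(wav_path, bias=0.5, depth=0.4):
    audio, rate = sf.read(wav_path)
    if audio.ndim > 1:                      # mix stereo down to mono
        audio = audio.mean(axis=1)
    audio = audio / np.max(np.abs(audio))   # normalize to [-1, 1]
    drive = bias + depth * audio            # intensity around the bias point
    return np.clip(drive, 0.0, 1.0), rate   # clamp to valid drive range

# "ok_google_open_garage.wav" is a hypothetical recording of the command.
drive, rate = audio_to_laser_drive("ok_google_open_garage.wav")
# 'drive' would then be fed, at the audio sample rate, to a DAC
# controlling the laser diode's current.
```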
The researchers have already notified Tesla, Ford, Amazon, Apple and Google about the issue -- an important step toward getting the problem fixed, since simply covering microphones with tape wouldn't solve it; most microphones, they said, would have to be redesigned. Using the technique, the team was able to hijack Google Home/Nest, Echo Plus/Show/Dot, Facebook Portal Mini, Fire TV Cube, Ecobee 4, iPhone XR, iPad 6th Gen, Samsung Galaxy S9 and Google Pixel 2 devices. Smart speakers were much easier to hijack from afar, though: against the phones and tablets, the method only worked at maximum distances of 16 to 65 feet.
This is far from the first digital-assistant vulnerability security researchers have discovered. Researchers from China's Zhejiang University found that Siri, Alexa and other voice assistants can be manipulated with commands sent at ultrasonic frequencies. Meanwhile, a group from the University of California, Berkeley found that they could take over smart speakers by embedding commands inaudible to the human ear directly into recordings of music or spoken text.
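For a sense of how the ultrasonic variant works, here is a conceptual sketch: the command audio is amplitude-modulated onto a carrier above human hearing, and nonlinearity in the microphone's circuitry demodulates it back into the audible band. The 25 kHz carrier, 96 kHz output rate and file names are illustrative assumptions, not parameters from the Zhejiang paper.

```python
# Sketch: amplitude-modulate a voice command onto an ultrasonic carrier.
import numpy as np
import soundfile as sf  # third-party: pip install soundfile

def ultrasonic_am(wav_path, carrier_hz=25_000, out_rate=96_000):
    audio, rate = sf.read(wav_path)
    if audio.ndim > 1:
        audio = audio.mean(axis=1)
    audio = audio / np.max(np.abs(audio))   # normalize to [-1, 1]
    # Naive linear resampling to the high output rate (a real attack
    # would use a proper resampler and an ultrasonic-capable speaker).
    t_old = np.arange(len(audio)) / rate
    t_new = np.arange(int(len(audio) * out_rate / rate)) / out_rate
    audio = np.interp(t_new, t_old, audio)
    carrier = np.cos(2 * np.pi * carrier_hz * t_new)
    signal = (1 + audio) * carrier / 2      # classic AM; inaudible to humans
    sf.write("ultrasonic_command.wav", signal, out_rate)

# "hey_siri_command.wav" is a hypothetical recording of the command.
ultrasonic_am("hey_siri_command.wav")
```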
https://www.engadget.com/2019/11/05/lasers-voice-commands-smart-speaker/