
Researchers Use Lasers to Hack Smart Speakers

University researchers have discovered a way to issue unauthorized commands to digital assistants like Alexa, Google Assistant, Facebook Portal and Siri via laser beams.

The microphones in devices like smart speakers, mobile phones and tablets convert sound into electrical signals, but the researchers found that the mics react to light aimed directly at them, too.

The research team — Takeshi Sugawara of The University of Electro-Communications in Tokyo; and Benjamin Cyr, Sara Rampazzi, Daniel Genkin and Kevin Fu of the University of Michigan — this week posted a paper online describing their findings.

“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” they explain at Light Commands, a website dedicated to the project.

Using that technique, an attacker can send silent “voice” commands to a digital assistant, which then executes tasks through its connected devices — tasks such as controlling smart home switches, opening garage doors, making online purchases, unlocking and starting motor vehicles, and unlocking smart locks.
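
To make the mechanism concrete: the attack boils down to amplitude-modulating a recorded voice command onto the laser's drive current, so that the beam's brightness, and therefore the microphone's photo-response, follows the audio waveform. Here is a minimal Python sketch of that mapping; the bias and modulation-depth figures are hypothetical placeholders, not values from the researchers' equipment.

```python
# Conceptual sketch of the "light commands" modulation step: an audio
# waveform is amplitude-modulated onto a laser diode's drive current, so
# the beam's intensity tracks the voice signal. All numbers here are
# hypothetical placeholders, not values from the researchers' setup.
import numpy as np

SAMPLE_RATE = 16_000         # Hz, a typical voice-command sample rate
BIAS_CURRENT_MA = 200.0      # hypothetical DC bias that keeps the diode lit
MODULATION_DEPTH_MA = 150.0  # hypothetical peak current swing around the bias

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a voice recording (floats in [-1, 1]) to a laser-driver current
    waveform. A microphone struck by the beam produces an electrical
    signal that follows this waveform, as if it had heard the audio."""
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT_MA + MODULATION_DEPTH_MA * audio

# Example: a 440 Hz test tone standing in for a spoken command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = 0.8 * np.sin(2 * np.pi * 440 * t)
current = audio_to_drive_current(tone)
print(current.min(), current.max())  # swing stays within the diode's range
```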

“Potentially anyone can do it,” U-M researcher Rampazzi told TechNewsWorld. “It requires a little bit of equipment and knowledge, but it’s not that complicated.”

All that’s needed for an attack are a simple laser pointer (US$13.99, $16.99 and $17.99 models on Amazon), a laser driver (Wavelength Electronics LD5CHA, $339), a sound amplifier (Neoteck NTK059, $27.99 on Amazon), and a telephoto lens (Opteka 650-1300mm, $199.95 on Amazon) for focusing the laser for long-range attacks, according to a parts list on the Light Commands site.

Threat to Consumers

There may be some practical obstacles for an attacker, noted Kaushal Kafle, a research assistant studying the security of smart home and Internet of Things platforms in the Department of Computer Science at William & Mary in Williamsburg, Virginia.

For example, a clear line of sight is necessary from the laser to the target. In addition, as the distance to the target increases, focusing directly on a microphone becomes more difficult.

However, “even if there are practical limitations, given the impact of such an attack, it is important for users to be aware of such attacks and place their smart speakers in a safe location — not one that is easily visible from outside through a window, for instance,” he told TechNewsWorld.

The kind of laser attack the researchers describe can pose a serious threat to any targeted consumer, because it gives an attacker free rein over a device.

“Anything the user can say, we can use the laser to do the same command,” U-M researcher Cyr told TechNewsWorld.

“Anything you can do by voice becomes a threat,” added Rampazzi.

The damage an attacker can do, though, is limited by the consumer's setup.

“Given that all three major voice assistant systems are heavily pushing for smart home integration, there are obviously dangers depending on the user’s setup,” Kafle said. “In the case of smart homes, it depends on what kind of smart devices are connected to the user’s voice assistant for access through voice commands.”

MEMS Mics Targeted

While the researchers focused on the major digital assistants, their work could have broader ramifications. During their experiments, they found that their laser attacks worked best on MEMS microphones.

Applying MEMS (microelectromechanical systems) technology to microphones led to the development of small microphones with very high performance characteristics — characteristics that are just the ticket for devices like smartphones, tablets and smart speakers. MEMS microphones offer a high signal-to-noise ratio, low power consumption and good sensitivity, and they are available in very small packages.

The reason laser attacks work on MEMS mics, Rampazzi explained, is that those microphones differ in construction and design from conventional mics.

Consumers can give themselves a measure of protection from laser attacks by requiring authentication for critical tasks like unlocking doors and starting up cars. That’s done by limiting the execution of those commands to a single voice.

“It doesn’t make the attack completely impossible, because you can record the voice of the owner or use a similar synthetic voice, because voice personalization is not so accurate yet,” Rampazzi explained.

“It is important for all voice systems to integrate user authentication, especially before carrying out security-sensitive actions or financial transactions,” noted Kafle.

Asking a user a simple randomized question before executing a command can be an effective way to prevent an attacker from successfully issuing commands, according to the researchers’ academic paper on laser attacks.

“However, we note that adding an additional layer of interaction often comes at a cost of usability, limiting user adoption,” they wrote.
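
As a rough illustration of that challenge-response idea, consider the following minimal Python sketch; the question bank, matching logic and callable interfaces are invented for illustration and do not come from the paper.

```python
# Minimal sketch of the randomized-question defense described above.
# The question bank and string matching are invented for illustration.
import random

QUESTIONS = [
    ("What color is the sky on a clear day?", "blue"),
    ("How many legs does a dog have?", "four"),
]

def confirm_sensitive_command(ask, listen) -> bool:
    """ask: callable that speaks a prompt aloud; listen: callable that
    returns the transcribed spoken reply. The sensitive command runs
    only if the reply contains the expected answer."""
    question, expected = random.choice(QUESTIONS)
    ask(question)
    reply = listen()
    return expected in reply.strip().lower()
```

Because the laser channel is one-way (an attacker can inject audio into the microphone but generally cannot hear the device's spoken prompt from a distance), a randomized prompt frustrates blind, pre-recorded replies.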

Industry’s Role

The industry also has a role to play in preventing laser attacks on gadgets.

“When it comes to manufacturers and vendors of smart devices, it is important for them to recognize this attack surface, and not only develop practical solutions but also raise the user’s awareness about such attacks, because it can be avoided in most cases through device placements,” Kafle pointed out.

Hardware makers could use sensor fusion techniques to block light-based command injection, the research paper notes.

Devices using voice commands typically have multiple microphones. Since sound is omnidirectional, a voice command will be detected by more than one mic.

On the other hand, in a laser attack, only a single microphone receives a signal.

A manufacturer could take that into account and have the device ignore commands issued through a single mic. However, an attacker could defeat that measure by simultaneously injecting light into all of the device's microphones using wide beams, the researchers acknowledged.
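
A minimal sketch of that single-microphone heuristic follows, assuming access to the raw multi-channel capture; the thresholds are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the single-mic heuristic: sound is omnidirectional, so a
# genuine voice command registers on every microphone, while a narrowly
# focused laser tends to drive only one. Thresholds are illustrative.
import numpy as np

def looks_like_light_injection(channels: np.ndarray,
                               silence_floor: float = 1e-4,
                               dominance_ratio: float = 10.0) -> bool:
    """channels: array of shape (num_mics, num_samples).
    Returns True when one mic carries nearly all of the signal energy."""
    if channels.shape[0] < 2:
        return False                         # heuristic needs multiple mics
    energy = np.mean(channels ** 2, axis=1)  # per-mic signal energy
    loudest = np.max(energy)
    others = np.sort(energy)[:-1]            # every mic but the loudest
    if loudest < silence_floor:
        return False                         # nothing audible at all
    # Suspicious if the loudest mic dwarfs all of the others.
    return bool(np.all(loudest > dominance_ratio * others))
```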

For sensor-rich devices like phones and tablets, sensor-based intrusion detection could block light injection attacks, because light commands will appear to the sensors as very different from audible commands, the researchers suggested.
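
As a toy illustration of that cross-sensor idea, a device might treat a microphone signal that coincides with a spike on a light sensor as suspect; the sensor interface and threshold below are hypothetical.

```python
# Toy illustration of sensor fusion against light injection. A genuine
# audible command should not coincide with a sudden brightness spike on
# a nearby light sensor. The threshold and readings are hypothetical.
def is_light_injected(mic_active: bool,
                      lux_now: float,
                      lux_baseline: float,
                      spike_factor: float = 5.0) -> bool:
    return mic_active and lux_now > spike_factor * lux_baseline
```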

“I find the idea of using lasers to issue voice commands as interesting but not compelling,” observed Chris Morales, head of security analytics at Vectra Networks, a San Jose, California-based provider of automated threat management solutions.

“I think it is a very cool trick and shows the susceptibility of new technology to techniques we have not thought of before, but the execution of such a technique seems like an advanced way to achieve a goal that could be much simpler,” he told TechNewsWorld.

“For the regular consumer, I don’t foresee a rash of home robberies using lasers to open doors,” he said, “when all one needs to do is throw a rock through a window.”

John P. Mello Jr.

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.
