The “Laser” attack against home assistant devices is here

Spread the word!

It's here. How long has it been here? Who's to say.

Given how quickly researchers picked this up, it's a safe bet the vulnerability has always been in place; it just took curiosity, time, and money to pull it off. Kudos.

Siri, Alexa, and Google Assistant are vulnerable to attacks that use lasers to inject inaudible—and sometimes invisible—commands into the devices and surreptitiously cause them to unlock doors, visit websites, and locate, unlock, and start vehicles, researchers report in a research paper published on Monday. Dubbed Light Commands, the attack works against Facebook Portal and a variety of phones. -- Ars Technica
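In rough terms, the attack works by amplitude-modulating a laser's intensity with the audio waveform of a voice command; the device's MEMS microphone responds to the fluctuating light as if it were sound pressure. Here's a minimal sketch of that modulation step. This is an illustration, not the researchers' actual tooling, and the function and parameter names (`intensity_modulate`, `bias`, `depth`) are hypothetical.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz, typical for voice audio

def intensity_modulate(audio, bias=0.5, depth=0.4):
    """Map an audio signal in [-1, 1] onto a non-negative laser
    intensity: a DC bias plus the signal scaled by a modulation depth.
    Physical intensity can't go negative, so bias must exceed depth."""
    audio = np.clip(audio, -1.0, 1.0)
    intensity = bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)

# A 440 Hz test tone standing in for a recorded voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
drive = intensity_modulate(tone)

# The drive signal stays in [0, 1] -- a physically realizable intensity.
assert drive.min() >= 0.0 and drive.max() <= 1.0
```

The key point is that the microphone demodulates this for free: light intensity moves the diaphragm, so the assistant "hears" the command while a human in the room hears nothing.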

Some tips?

  1. Don't put this device in your house.
  2. Don't put this device in your house.
  3. Don't put this device in your house.
  4. Don't put this device in your house.
  5. Don't put this device in your house.
  6. Don't put this device in your house.
  7. Don't put this device in your house.
  8. Don't put this device in your house.
  9. Don't put this device in your house.
  10. Don't put this device in your house.

Okay, joking aside:

  1. Don't place a home assistant (Google Home, Alexa, et al.) near any windows.
  2. Don't let it control any security-critical parts of your house (door locks, garage doors, etc.).

That's all I have.

This is a good time to bring up Apple patenting a technique to do something similar: remotely disabling cameras in iPhones (patent here).

It’s all based upon the detection of an infrared signal. As per the patent’s abstract, “The image processing circuitry can determine whether each image detected by the camera includes an infrared signal with encoded data. If the image processing circuitry determines that an image includes an infrared signal with encoded data, the circuitry may route at least a portion of the image (e.g., the infrared signal) to circuitry operative to decode the encoded data,” making it so that you wouldn’t be able to take pictures during live performances or, say, in a movie theater. -- Digital Trends

Or look at other studies from 2018.

With audio attacks, the researchers are exploiting the gap between human and machine speech recognition. Speech recognition systems typically translate each sound to a letter, eventually compiling those into words and phrases. By making slight changes to audio files, researchers were able to cancel out the sound that the speech recognition system was supposed to hear and replace it with a sound that would be transcribed differently by machines while being nearly undetectable to the human ear. -- NYTimes
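The core idea in that quote is a perturbation small enough to be near-inaudible to humans but large enough to change what a machine transcribes. Here's a toy sketch of just the "small bounded perturbation" part; a real attack would optimize the perturbation against a speech-recognition model's gradients, which is omitted here. The names (`add_bounded_perturbation`, `epsilon`) are hypothetical.

```python
import numpy as np

def add_bounded_perturbation(audio, perturbation, epsilon=0.005):
    """Clip the perturbation to +/- epsilon (an 'inaudibility budget')
    before mixing it into the original signal, keeping the result a
    valid waveform in [-1, 1]."""
    delta = np.clip(perturbation, -epsilon, epsilon)
    return np.clip(audio + delta, -1.0, 1.0), delta

rng = np.random.default_rng(0)
t = np.arange(16_000) / 16_000
audio = np.sin(2 * np.pi * 440 * t)          # stand-in for clean speech
noise = rng.normal(scale=0.01, size=audio.shape)
attacked, delta = add_bounded_perturbation(audio, noise)

# The perturbation stays within the inaudibility budget, so to a human
# listener the attacked waveform is essentially unchanged.
assert np.max(np.abs(delta)) <= 0.005
```

The gap the researchers exploit is that a speech recognizer, unlike the human ear, can be pushed to a completely different transcription by changes this small.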

Let's take this technology seriously, and if you do want to use devices like these, consider self-hosted, in-house solutions instead.