Researchers Use Ultrasonic Waves to Hijack Digital Voice Assistants

Photo Credit: Unsplash

Security researchers say they can use ultrasonic waves to hijack voice assistants like Bixby and Siri.

Ultrasonic waves are sounds above the range of human hearing, so the hijacking is imperceptible. The attack can be used to read messages, make fraudulent purchases, or take pictures and listen in on users without their knowledge.
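Attacks in this family hide a voice command by riding it on an ultrasonic carrier: the phone's microphone hardware demodulates the signal back into the audible band, so the assistant "hears" the command while humans hear nothing. A minimal sketch of that amplitude modulation in Python follows; the 25 kHz carrier, the sample rate, and the test tone are illustrative assumptions, not parameters from the researchers' paper.

```python
import numpy as np

SAMPLE_RATE = 192_000   # Hz; a high rate is needed to represent an ultrasonic signal
CARRIER_FREQ = 25_000   # Hz; assumed carrier, above the ~20 kHz limit of human hearing

def modulate_command(audio: np.ndarray, depth: float = 0.5) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    The result contains no energy in the audible band; nonlinearity in a
    microphone's hardware recovers the original envelope (the command).
    """
    t = np.arange(len(audio)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * CARRIER_FREQ * t)
    # Standard AM: offset the baseband so the carrier's envelope tracks the voice.
    return (1.0 + depth * audio) * carrier

# Stand-in for a recorded wake phrase: a 0.1 s, 500 Hz test tone.
t = np.arange(int(0.1 * SAMPLE_RATE)) / SAMPLE_RATE
baseband = np.sin(2 * np.pi * 500 * t)
ultrasonic = modulate_command(baseband)

# Check that the modulated signal's energy sits at the inaudible carrier frequency.
spectrum = np.abs(np.fft.rfft(ultrasonic))
freqs = np.fft.rfftfreq(len(ultrasonic), 1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # dominant frequency in Hz, far above human hearing
```

The spectrum peaks at the carrier, with the command's content carried in sidebands around it; nothing remains near 500 Hz, which is why the transmission is silent to bystanders.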

The ultrasonic hijack of voice assistants has been dubbed SurfingAttack by researchers.

The attack uses inaudible sound waves to activate the targeted smartphone, and can target smartphones with a built-in digital assistant.

The attack involves attaching a roughly $5 piezoelectric transducer to the underside of a table; the transducer sends ultrasonic vibrations through the table's surface to a phone resting on top. Researchers could wake up voice assistants and issue commands to make phone calls. They could also take pictures or read message history – including two-factor authentication texts.

The SurfingAttack was tested on a total of 17 smartphones and found to be effective against most. Researchers say Apple iPhones, Google Pixels, and Samsung Galaxy devices are vulnerable to the attack.

They did not specify which iPhone models are at risk from the ultrasonic attack. The exploit can hijack virtual assistants like Siri, Google Assistant, and Bixby.

Researchers note two phone models that seem immune to the ultrasonic attack: the Huawei Mate 9 and the Samsung Galaxy Note 10+. They believe these phones' different sonic properties may contribute to their immunity.

The attack does not work on smart speakers like Amazon Echo or Google Home.

Researchers say the primary risk comes from devices hidden underneath tables or desks, which could silently activate nearby smartphones to steal information – such as 2FA codes for email accounts.

The finding comes from a team of researchers from several universities. Washington University in St. Louis, Michigan State University, the Chinese Academy of Sciences, and the University of Nebraska-Lincoln all contributed to the report.

The report was first presented at a security symposium in San Diego.