

A New Attack Vector Using Siri, Alexa

I have heard of this before, but now it has a name and demonstrable proof.

It is called a Near-Ultrasound Inaudible Trojan (NUIT).

The sensitivity of the voice-controlled microphones in home Internet of Things devices is amazing, whether they run Siri, Google Assistant, Alexa or something else.

More importantly, these microphones can pick up frequencies above the human hearing range (unless there are some dogs reading this blog).
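To make the frequency band concrete: human hearing tops out around 20 kHz, while consumer microphones sample well beyond that. This is a minimal Python sketch that generates a 0.77-second near-ultrasound tone as a WAV file, purely to illustrate the band involved; the filename, parameters and simple sine tone are my own assumptions for illustration, not the actual NUIT command encoding.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100   # standard consumer audio rate; can represent up to 22.05 kHz
FREQ_HZ = 20000       # near-ultrasound: at or above most adults' hearing ceiling
DURATION_S = 0.77     # the minimum attack duration reported in the article

# Generate a pure sine tone at the near-ultrasound frequency.
n_samples = int(SAMPLE_RATE * DURATION_S)
samples = [math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE)
           for i in range(n_samples)]

# Write it as a 16-bit mono WAV file (hypothetical filename).
with wave.open("nuit_tone.wav", "w") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```

Played through an ordinary speaker, a tone like this is inaudible to most people, yet a nearby voice-assistant microphone still registers it.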

Researchers at universities in Texas and Colorado (UT and CU) conducted the work.

The attack is operating-system independent, meaning it works against iOS, Android, Google Home, Amazon Echo and Windows Cortana, among others. Attacks against these platforms have been demonstrated online. Other than a map, the hackers have everything they need.

For example, a malicious website could play NUIT sounds in the background. Embedded in these sounds might be “turn down the volume on my smart speaker”, followed by “unlock the door” or “transfer money”.

Guessing which IoT devices a user owns, and what those devices can do, is a bit of a crapshoot, but since these sounds are above human hearing, the hackers do not need to get it right the first time. A successful attack might take only 0.77 seconds.

Fundamentally, any device that can accept audio commands may be susceptible to attack. That includes smart TVs. Or an iPhone attacked via music playing from a smart speaker, or vice versa.

In most cases, the inaudible sounds do not even need to be recognizable as an authorized user.

Out of 17 tested devices, only Apple Siri required stealing the user's voice (at least for now); the rest will accept any voice, including a robotic one. Capturing a voice sample can happen, for example, during a Zoom meeting.

One solution is to not have any speakers. That is really realistic, right? Not!

There are dozens of videos demonstrating how this works available online.

Credit: Dark Reading