
Digital Assistants Pose Security Risks
Voice-activated digital assistants like Amazon’s Alexa and Apple’s Siri have become commonplace in most households and, increasingly, in businesses. This opens up new types of security risks, from commands that aren’t audible to humans to exploits of accessibility settings.
Digital Assistants Are Meant to Make Our Tasks Easier
Voice-activated assistants, from the Amazon Echo on your countertop to Cortana on your PC or Siri on your iPhone, are intended to connect users to services through an easy-to-use voice interface. However, they are also making cyber-attackers’ jobs easier.
An independent researcher has already found an exploit, dubbed “Open Sesame”, that bypasses the security on a locked Windows PC: Cortana still accepts voice commands while the device is locked, and those commands can be used to compromise the machine.
As consumer demand for these devices and features grows, voice assistants prioritize convenience over security. They can now be found in almost every kind of device: mobile phones, tablets, computers, and smart speakers such as the Amazon Echo and Google Home, which have been popping up in households and businesses across North America.
There Have Been Issues Already
In January 2017, an on-air newscaster said, “I love the little girl saying, ‘Alexa ordered me a dollhouse,’” prompting Alexa devices in viewers’ homes to order dollhouses. In May 2018, Amazon Alexa picked up a couple’s private conversation, recorded it, and sent it to one of their contacts.
These examples are an early illustration that voice assistants are, at bottom, sensors that are constantly listening for commands, which makes them not only a security issue but also a privacy issue.
Here are some of the ways voice assistants can be used in attacks:
1) Hiding Commands in Audio – Nicholas Carlini of UC Berkeley demonstrated a technique that takes an audio clip transcribing as one phrase and modifies it into a 99.9-percent-similar clip that transcribes as a completely different phrase. Commands can even be hidden in music. (A toy sketch of the underlying idea appears after this list.)
2) Machines Can Hear It, You Cannot – In 2017, six researchers from Zhejiang University presented DolphinAttack, which uses sound inaudible to humans to command Siri to make a phone call or perform other actions. The same trick can be used to make a device visit a malicious website, spy on the user, inject fake information, or mount a denial-of-service attack. (A signal-processing sketch also follows the list.)
3) Is This On? – Even when a voice assistant is not taking action on your behalf, it continues to listen for commands. Much like mobile phones, voice assistants are sensors that know a lot about you. This gives the companies behind them access to privileged places in your home or business, which makes those companies an ideal target for cybercriminals.
4) Device Jumping – Attackers typically find their way into a home or business through the router or an unsecured wireless network. Voice assistants add another vector: an attacker can bridge from one compromised device to another by having it issue voice commands to a nearby assistant. The dollhouse example above is a version of this.
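To make the first attack concrete, here is a minimal toy sketch of the adversarial-audio idea: nudge an input just enough that a model’s transcription flips to an attacker-chosen phrase. Everything in it, the two-phrase “transcriber”, its random weights, and the feature vector standing in for audio, is an assumption for illustration only; Carlini’s real attack optimizes against a full speech-to-text model.

```python
# Toy sketch of the adversarial-audio idea (NOT Carlini's actual attack):
# find a small perturbation that flips a model's transcription to a
# target phrase. The two-phrase linear "transcriber" is a stand-in model.
import numpy as np

rng = np.random.default_rng(0)
PHRASES = ["play music", "unlock the door"]
W = rng.normal(size=(2, 64))  # stand-in model: 64 audio features -> 2 scores

def transcribe(features):
    return PHRASES[int(np.argmax(W @ features))]

clip = rng.normal(size=64)
if transcribe(clip) != "play music":
    clip = -clip  # ensure the starting transcription is "play music"

# Gradient ascent on the target phrase's score margin; stop as soon as
# the transcription flips so the perturbation stays as small as possible.
grad = W[1] - W[0]  # gradient of (target score - source score) w.r.t. input
delta = np.zeros_like(clip)
while transcribe(clip + delta) != "unlock the door":
    delta += 0.01 * grad

print("original :", transcribe(clip))          # play music
print("perturbed:", transcribe(clip + delta))  # unlock the door
print("relative perturbation:", np.linalg.norm(delta) / np.linalg.norm(clip))
```

The real attack additionally constrains the perturbation so the modified clip sounds essentially identical to a human listener.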
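The second attack rests on a signal-processing trick: amplitude-modulating a voice command onto an ultrasonic carrier that humans cannot hear, which a microphone’s own nonlinearity then demodulates back into an audible command. Below is a rough numerical sketch of that modulation; the 1 kHz tone standing in for recorded speech and the 25 kHz carrier are illustrative assumptions, not the researchers’ exact parameters.

```python
# Sketch of the signal trick behind DolphinAttack: amplitude-modulate a
# "voice command" onto an ultrasonic carrier. Humans can't hear the
# result, but a microphone's nonlinearity demodulates the hidden command.
import numpy as np

FS = 192_000         # sample rate high enough to represent ultrasound
CARRIER_HZ = 25_000  # above the ~20 kHz limit of human hearing
t = np.arange(0, 1.0, 1 / FS)

# Baseband command (a 1 kHz tone stands in for real recorded speech).
command = np.sin(2 * np.pi * 1_000 * t)

# Classic AM: carrier scaled by (1 + m * command), m = modulation depth.
m = 0.8
ultrasonic = (1 + m * command) * np.sin(2 * np.pi * CARRIER_HZ * t)

# A microphone's nonlinearity acts roughly like squaring the input,
# which shifts a copy of the command back down into the audible band.
demodulated = ultrasonic ** 2

# FFT check: below 20 kHz, the strongest non-DC component is the 1 kHz
# command that was inaudible in the transmitted ultrasonic signal.
spectrum = np.abs(np.fft.rfft(demodulated))
freqs = np.fft.rfftfreq(len(demodulated), 1 / FS)
audible = spectrum[freqs < 20_000]
peak_hz = freqs[np.argmax(audible[1:]) + 1]  # skip the DC bin
print(f"recovered baseband peak: {peak_hz:.0f} Hz")  # ~1000 Hz
```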
For the majority of these issues, there is no easy solution or simple fix, and security filters can make these devices more difficult to use. At a minimum, be aware of which smart devices are on your network and find ways to minimize your exposure; the sketch below shows one way to take such an inventory. Consulting with a cybersecurity adviser is the best first step in securing your organization or home against these vulnerabilities.
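As one concrete way to act on that advice, the sketch below inventories the devices on a local network with an ARP sweep, so an unexpected smart speaker or assistant stands out. It assumes the Python scapy package, a typical 192.168.1.0/24 home subnet, and that you have permission (and root privileges) to scan the network you are on.

```python
# Minimal network inventory via ARP sweep (requires scapy and root).
from scapy.all import ARP, Ether, srp

SUBNET = "192.168.1.0/24"  # adjust to your own network

# Broadcast an ARP "who-has" request for every address in the subnet.
answered, _ = srp(
    Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=SUBNET),
    timeout=2,
    verbose=False,
)

# Each reply reveals a live device's IP and hardware (MAC) address;
# the MAC prefix often identifies the vendor (Amazon, Google, etc.).
for _, reply in answered:
    print(f"{reply.psrc:15}  {reply.hwsrc}")
```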