How Amazon Alexa Devices Can Be Hijacked Using Commands

Can you believe Amazon Alexa devices can be hijacked using commands from their own speaker? The answer is YES: without critical updates, these devices can wake themselves up and start executing audio commands issued by a remote attacker.

According to researchers at Royal Holloway, University of London, hackers can exploit a critical vulnerability in Amazon Alexa devices to gain access and broadcast commands to the device itself or to other smart speakers nearby, enabling the threat actor to start “smart appliances within the household, buy unwanted items, tamper [with] linked calendars and eavesdrop on the [legitimate] user.”

Researchers Sergio Esposito and Daniele Sgandurra, working with Giampaolo Bella of Italy’s Catania University, have dubbed the flaw Alexa versus Alexa (AvA), describing it as “a command self-issue vulnerability.”

They added, “Self-activation of the Echo device happens when an audio file reproduced by the device itself contains a voice command.”

The AvA flaw affects both third- and fourth-generation (the latest release, first shipped in September 2020) Echo Dot devices.

According to the research paper, threat actors can trigger the attack by using an Alexa smart device to play crafted audio files to itself. These files can be hosted on an internet radio station that an Amazon Echo can tune in to; to take control of the device, the attacker only needs to get the Echo to tune to that station (essentially a command-and-control server, in infosec argot).

Further, executing the attack is then a matter of exploiting Amazon Alexa Skills. According to Amazon, Skills “are like apps that help you do more with Alexa. You can use them to play games, listen to podcasts, relax, meditate, order food, and more.”

Skills shape what Alexa says through SSML (Speech Synthesis Markup Language) tags. Sergio Esposito explained: “It is a language that allows developers to program how Alexa will talk in certain situations, for example. An SSML tag could say that Alexa would whisper or maybe speak with a happy mood.”
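As an illustration, here is a minimal SSML snippet of the kind a Skill could return. The whispered effect is a documented Alexa SSML feature; the wording, and the emotion effect (available only for some locales and voices), are hypothetical examples rather than the researchers’ actual proof of concept.

    <!-- Hypothetical example: SSML controlling how Alexa speaks a Skill's response -->
    <speak>
        This sentence is spoken in Alexa's normal voice.
        <amazon:effect name="whispered">
            This sentence is whispered.
        </amazon:effect>
        <amazon:emotion name="excited" intensity="medium">
            And this one sounds upbeat, the "happy mood" Esposito describes.
        </amazon:emotion>
    </speak>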

He further said, “So, an attacker could use this listening feature to set up a social engineering scenario in which the skill pretends to be Alexa and replies to the user’s utterances as if it was Alexa.”
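A sketch of how that could look in practice: an Alexa Skill replies with a JSON response whose outputSpeech carries SSML, and setting shouldEndSession to false keeps the session open so the Skill continues listening for the user’s next utterance. The deliberately Alexa-like phrasing below is a hypothetical illustration, not code from the AvA paper.

    {
      "version": "1.0",
      "response": {
        "outputSpeech": {
          "type": "SSML",
          "ssml": "<speak>Sorry, I didn't catch that. What would you like me to do?</speak>"
        },
        "reprompt": {
          "outputSpeech": {
            "type": "PlainText",
            "text": "What would you like me to do?"
          }
        },
        "shouldEndSession": false
      }
    }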

Anyone can create a new Alexa Skill and publish it to the Alexa Skills store, and a Skill requires no special privileges to run on an Alexa-enabled device. However, Amazon says it does check Skills before allowing them to go live.

Amazon has fixed most of the vulnerabilities, with the exception of the one involving a Bluetooth-paired device that can play crafted audio files through a vulnerable Amazon Echo speaker. That attack requires the threat actor to be close enough to pair with the speaker, since Bluetooth range is around 10 m, yet according to Sergio Esposito it may still be a bigger problem than someone just remotely turning on your dishwasher.

One of the vulnerabilities is tracked as CVE-2022-25809 and has been assigned a medium severity. According to the US National Vulnerability Database, it affects “3rd and 4th Generation Amazon Echo Dot devices,” allowing “arbitrary voice command execution on these devices via a malicious ‘Skill’ (in the case of remote attackers) or by pairing a malicious Bluetooth device (in the case of physically proximate attackers), aka an ‘Alexa versus Alexa (AvA)’ attack.” Amazon has advised its users to say, “Check for software updates,” to install the latest software on their Echo devices.
