laitimes

Alexa, hack yourself: The Amazon Echo may be messing around behind your back

Wen | Du Chen

Edited | Vicky Xiao

Small smart speakers like Amazon Echo Dot and Google Home Mini are very cheap and powerful. According to the survey, the penetration rate of these two smart speaker series reached 35% of households in the United States before the epidemic, and it is expected to reach 75% in 2025.

However, several security researchers from the United Kingdom and Italy recently discovered that Amazon's Echo smart speakers have a rather tricky social engineering vulnerability.

This vulnerability allows an attacker to activate and hijack a speaker and perform various operations behind the user's back. In terms of impact, in addition to violating user privacy, it can also lead to more serious property damage and even the risk of personal injury.

Outrageously, this vulnerability doesn't require any complicated hacking code, and can be achieved by the Echo speaker itself - in simple terms, the Echo speaker uses music and radio skills (skills), and if a specific piece of audio is played, and the audio contains a specific trigger word/instruction, the vulnerability is triggered.

Researchers named the vulnerability Alexa versus Alexa (AvA) – as the name suggests, Alexa (Amazon's virtual voice assistant) hacked himself...

The better the experience, the greater the vulnerability

In order to make smart speakers and voice assistant products more used by more people, manufacturers are studying how to further enhance the experience. However, many features designed to enhance the experience add to the user's problems and may even become the security vulnerabilities discussed in this article.

Virtual voice assistants, including Siri, Google Assistant (GA), Alexa, etc., keep the microphone on because they need to listen for wake-up passwords such as "Hey Siri", "Ok Google", "Hey Alexa" and so on. However, because the recognition is not completely accurate, these virtual voice assistants are often triggered by mistake – a situation that Apple mobile phone users have often encountered in the past.

The researchers found that in addition to inaccurate recognition, the Echo speaker has another problem: its ability to eliminate interference from the sound it emits is not very good. In short, if we ask the Echo to play a piece of audio that contains commands that can control echo to do something else — as a result, echo will give itself orders.

After testing, the Echo that was hijacked in this way can do but not limited to: playing audio files or online radio stations, listening to conversations in the room, adjusting alarm clocks, modifying user calendar items, calling arbitrary numbers, manipulating smart home devices, and even using the owner's Amazon account to buy things online...

Let's take a closer look at how this vulnerability is attacked:

1) First, the hacker makes an audio file that sounds completely problem-free, such as a song, or a podcast, and adds a command to the audio file that activates Alexa/Echo and makes it perform a specific action;

2) Hackers have two attack angles to choose from: within a sufficient distance from the target's home, use the mobile phone Bluetooth to link the Echo speaker, and then play the audio (1.2 in the figure below), or you can directly make the audio into an online radio station, through the social work method to let the target Echo play (1.1 in the figure below);

Alexa, hack yourself: The Amazon Echo may be messing around behind your back

Vulnerability attack method

Note: Echo has the skills to play online radio stations without installing an additional app, which runs in the cloud (as shown on the right side of the image above). And anyone can develop similar skills on their own and publish them to Amazon's Alexa Skills Store. While Amazon conducts security checks on skills released for the first time, developers can still include malicious code in subsequent updates and will not be discovered by Amazon.

3) Echo plays a suspicious audio file, accepts the instructions in the audio, and can carry out various operations without the user's knowledge, causing trouble to the user, such as modifying or even canceling the alarm clock, making people oversleep; turning on and off smart light bulbs, making people think that the home is haunted; modifying calendar items, making people miss important things, etc.;

Don't think these are harmless little jokes, this vulnerability could well lead to more serious privacy breaches, property damage, and personal danger.

Take three scenarios as an example:

1. Privacy breaches: Hackers can add "go on" to suspicious music files. This phrase also corresponds to a skill supported by Echo speakers, which can greatly extend the time that echo remains active and listens to the user's speech; in turn, hackers can also send the user's speech content to a network server. In the most extreme cases, hackers are perfectly capable of combining go on and other skills to completely hijack echo speakers, replacing user-issued commands with their own...

2. Property Damage: In the past, we occasionally saw news that someone had received a package from Amazon, but they hadn't bought anything inside — and with the vulnerability explored in this article, hackers could have hijacked echoes to buy something.

This is because usually Echo will only verify the user's identity and account information when it is first configured after purchase, and then there is no need to do additional identity verification, whether it is to install skills or place orders online - these were originally designed to make the experience smoother, and now may be exploited.

3. Personal injury and serious property damage: If the user has a Alexa-compatible smart door lock installed in the home, the bad guy can connect the speaker at the door bluetooth, play instructions, open the door lock - this can become a serious burglary risk...

This vulnerability is named CVE-2022-25809:

Alexa, hack yourself: The Amazon Echo may be messing around behind your back

CVE-2022-25809

The results of the study have also been written into a paper and placed on arXiv.

Affected products and severity

Worryingly, if you combine the two attack angles mentioned earlier in online radio (remote) and Bluetooth connection (live), the level of danger of this vulnerability is very high:

Remote intrusion of targets can be achieved from anywhere in the world,

Can hijack multiple Echo devices at once,

Attacks can be launched without the use of social work methods,

Can relaunch the attack after the disconnection,

It is even possible to cover up after the first attack is completed and the connection is established, thus enabling a long-term invasion, turning it into a "broiler" and so on...

Alexa, hack yourself: The Amazon Echo may be messing around behind your back

Attack Vectors

Researchers have submitted information directly to Amazon about the main vulnerability, AvA, as well as two other small vulnerabilities found by the way, Full Volume and Break Tag Chain.

The Full Volume vulnerability can increase the volume of content played by the speaker, thereby increasing the chances of hijacking the affected Echo device and other networked Echo devices in the same room;

Break Tag Chain can extend the duration of a particular skill without the user's knowledge, helping hackers to further refine social work attack scenarios.

The verification and reproduction of the vulnerability was carried out on the third generation of Echo Dot, but the researchers pointed out that this vulnerability exists in all Echo smart speaker products of the third and fourth generations.

Amazon rated the severity of the vulnerability as "Intermediate" and recently released a patch update (version number: 3rd generation 6812454788, 4th generation 6409855108) for the affected Echo products.

This patch reduces the chance of the Echo device being activated by the trigger word in the content it plays to some extent, but it does not completely close the vulnerability. Because as we mentioned earlier: this vulnerability is not caused by a code defect, but by functional design.

In principle, as long as the Echo speaker or microphone turn on the listening trigger word at full time, as long as the skill publishing, auditing, and invocation mechanism remains the same, as long as the Echo ensures the user experience and does not authenticate the user at the time of a specific operation - this vulnerability will continue to exist.

The researchers point out that there are several ideas to combat this vulnerability:

1) The ability to suppress the smart speaker to be triggered by the content played by itself: At this point, echo has a similar mechanism design, using a multi-microphone array to more accurately detect the source of voice commands, which is convenient for judging whether the command comes from the user or itself.

2) Detect the sound wave information of the voice command: If the sound wave includes a low-frequency sound wave that the human channel cannot emit, there is a high probability that it is from oneself or another speaker.

3) Strictly use the voice of known users in more scenarios: Many smart speakers will let users say a few more words when they are first set up, so that they can hear who is speaking, so as to complete the operation in a targeted manner. However, at least for the current Echo speaker, it does not verify that the command is from a known user when performing high-risk operations (such as paying for payments and manipulating other smart devices). Amazon should optimize at this point.

If you are reading this article and you are also using an Echo speaker product, you can check whether your device has been updated to the latest version.

Generally speaking, we can still use smart speakers normally, but if you are really worried about the speaker being hijacked, you can completely turn off the microphone on the speaker when you are not in use for a long time (such as before going out), as long as you press the microphone button, the indicator light becomes red.

This article is from the WeChat public account "Silicon Star People" (ID: guixingren123), author: Spectrum Du Chen, 36Kr published with permission.

Read on