If you haven’t been regularly deleting your voice history with Amazon’s voice assistant Alexa, a recently fixed vulnerability that would have exposed all your conversations with the smart speaker could be a good reason to start.
On Thursday, researchers from the cybersecurity firm Check Point released a report detailing security issues they discovered with Amazon’s Alexa that would have allowed an attacker to obtain a person’s conversation logs with the smart speaker and install skills on the device without the person knowing.
“The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us. We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems,” an Amazon spokesperson said in a statement.
The company said it had not seen any cases of this vulnerability being used.
Check Point said it reached out to Amazon in June, and the tech giant has since fixed the flaw, but the security concerns serve as a strong reminder to minimize the amount of history logged with your smart speakers.
Connected devices at home present a new opening for hackers, and smart voice assistants are no different. Security researchers have frequently demonstrated flaws with Alexa, from a stranger yelling at the device to unlock your doors to a laser pointer activating it from 300 feet away.
Many of these concerns are mitigated by the fact that an attacker would need to be near your home, or at least within your speaker’s range, to pull them off. Exploiting the flaws Check Point found, however, would have required just a single click, the researchers said.
The vulnerabilities were in Amazon’s subdomains — URLs like track.amazon.com, for example. While you might be skeptical enough to avoid clicking on suspicious links, a URL with Amazon’s domain in it could be enough to make you believe you’re safe.
The security researchers found that they could inject code into pages served from those subdomains, allowing them to extract a security token tied to your Alexa account. Using that token, a potential attacker could pose as you to install skills, get a list of the skills you already use, and view your voice chat history with Alexa.
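To see why a leaked token is so powerful, here is a minimal illustrative sketch. The endpoint, path, and header names below are placeholders invented for illustration, not Amazon’s real API: the point is simply that once an attacker holds the token your browser uses, any request they build with it looks like it came from you.

```python
# Illustrative sketch only: hypothetical URL and header names, not
# Amazon's real API. A request carrying the victim's stolen token is
# indistinguishable from one the logged-in user made themselves.
import urllib.request

STOLEN_TOKEN = "example-csrf-token"  # exfiltrated by the injected script


def forge_request(action: str) -> urllib.request.Request:
    """Build a request that impersonates the victim using their token."""
    req = urllib.request.Request(
        f"https://alexa.example.com/api/{action}",  # placeholder endpoint
        method="POST",
    )
    # Attach the stolen token, just as the victim's own browser would.
    req.add_header("csrf-token", STOLEN_TOKEN)
    return req


# With this in hand, "install a skill" or "list voice history" are just
# different action strings — the server sees a legitimate user session.
req = forge_request("install-skill")
print(req.get_header("Csrf-token"))
```

This is also why the single click matters: the injected script runs with the trust of the Amazon subdomain, so the token is handed over without any further interaction from the victim.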
Depending on how sensitive your conversations with Alexa are, it could mean access to your health information, your finances, or just the silly day-to-day stuff you’d ask the voice assistant.
“Smart speakers and virtual assistants are so commonplace that it’s easy to overlook just how much personal data they hold, and their role in controlling other smart devices in our homes,” Oded Vanunu, Check Point’s head of products vulnerabilities research, said in a statement. “But hackers see them as entry points into peoples’ lives, giving them the opportunity to access data, eavesdrop on conversations or conduct other malicious actions without the owner being aware. We conducted this research to highlight how securing these devices is critical to maintaining users’ privacy.”
Check Point said attackers could have started eavesdropping on conversations by installing a skill, but Amazon scans skills for malicious activity and blocks them from its marketplace. The voice history log is a bigger concern, and the vulnerabilities are a reminder that you should be regularly deleting your conversations with Alexa.
Like other voice assistant providers, Amazon keeps records of your voice history with Alexa to improve its own artificial intelligence, and unless you opt out, human reviewers will listen to those conversations, too.
You can set your voice history to delete automatically after three months or 18 months, but if you want it deleted every day or every week, you’ll need to do it manually.
Given vulnerabilities like this one, regularly deleting your history is a good habit to adopt, since hackers could gain access to those sensitive records. Ask yourself: do the pros of keeping a history of your conversations with Amazon outweigh the cons?
Keep in mind that even though deleting your voice history can protect you from potential hackers, it does little for your privacy from Amazon itself.
In a letter to senators from July 2019, Amazon said that it keeps transcripts of voice recordings indefinitely, even when the audio itself is deleted.