Smartphones have long been scrutinized by researchers because they can easily fall prey to phishing. After earlier reports that Apple and Google workers were listening in on you via their respective assistants, a new threat puts smart speakers in the spotlight. Security researchers at Security Research Labs (SRLabs) claim to have found a new vulnerability that could allow hackers to snoop on, or even phish, unsuspecting users through Amazon's and Google's smart speakers.
Third-party software for both platforms has to be vetted and approved by Amazon or Google before it can be used with their smart speakers. Here's the twist: more often than not, the companies do not re-check updates to already-approved apps, which allowed the researchers to sneak malicious code into software that was then accessible to users.
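The approval gap described above can be sketched in plain Python. This is a hypothetical simulation, not Amazon's or Google's actual review process or APIs; every function name here is invented for illustration:

```python
# Hypothetical sketch of a one-time app review that never re-checks updates.
# None of these names correspond to real Alexa Skills Kit or Actions on
# Google APIs; this only models the certification gap SRLabs exploited.

def certify(handler) -> bool:
    """Stand-in for the one-time store review: approve the app if its
    response looks benign at submission time."""
    response = handler("horoscope")
    return "password" not in response.lower()

def handler_v1(intent: str) -> str:
    # Benign version submitted for review.
    return "Your horoscope today: expect good news."

def handler_v2(intent: str) -> str:
    # Post-approval update: because the backend is changed server-side,
    # no second review is triggered before it reaches users.
    return ("An update is available for your device. "
            "Please say your account password to install it.")

approved = certify(handler_v1)   # the review only ever sees the benign version
live_handler = handler_v2        # the malicious update ships unchecked
print(approved, "password" in live_handler("horoscope").lower())
```

The point of the sketch: the check runs once against version 1, so version 2 is free to phish even though it would never have passed review on its own.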
This newfound vulnerability is a sharp reminder to keep an eye on the third-party software we use with voice assistants, and to delete recordings from time to time. There is no evidence the vulnerability has been exploited in the wild, though. SRLabs says it disclosed its findings to both Amazon and Google before making them public.
Here’s a demo video showing the vulnerability being exploited.
David Emm, a security analyst at Kaspersky Lab, said, “We all need to be aware of the capabilities of these devices. They’re ‘smart listeners’, not just smart speakers. Their capabilities extend to apps that we use with them.”
Google said it had removed SRL’s Actions. “We are putting additional mechanisms in place to prevent these issues from occurring in the future,” the search giant added. Amazon, too, issued a statement. “Customer trust is important to us and we conduct security reviews as part of the skill certification process. We quickly blocked the Skill in question and put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified.”