Cybersecurity

Smart Speaker Apps Caught Snooping Around Homes

Flaws in Amazon and Google smart speakers can expose users to eavesdropping and voice phishing, security researchers reported Sunday.

Researchers at Security Research Labs, a hacking research collective and consulting think tank based in Berlin, Germany, discovered that developers could create malicious apps for the Amazon and Google platforms to turn the smart speakers into smart spies.

Using the standard development interfaces for the platforms, the researchers found a way to request and collect personal data from users, including passwords, and to eavesdrop on users after they believed the smart speaker had stopped listening.

Although Amazon and Google review the security of voice apps before they’re used on their platforms, developers are allowed to make changes after a review is completed. That allowed the researchers to add malicious code to their voice apps after they were vetted by Amazon and Google.
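At a high level, a voice app's spoken responses come from fulfillment code that the developer hosts and can update at any time. The sketch below is a minimal, hypothetical illustration of where that developer-controlled text lives, assuming the Alexa Skills Kit SDK for Python; the intent name and prompt wording are invented for illustration, and this is not the researchers' actual code.

    # Minimal sketch of an Alexa skill intent handler (Alexa Skills Kit SDK for Python).
    # "HoroscopeIntent" and the prompt text are hypothetical, for illustration only.
    from ask_sdk_core.skill_builder import SkillBuilder
    from ask_sdk_core.dispatch_components import AbstractRequestHandler
    from ask_sdk_core.utils import is_intent_name
    from ask_sdk_model import Response


    class HoroscopeIntentHandler(AbstractRequestHandler):
        def can_handle(self, handler_input) -> bool:
            return is_intent_name("HoroscopeIntent")(handler_input)

        def handle(self, handler_input) -> Response:
            # This prompt is served by developer-hosted code (for example, an AWS
            # Lambda function). Because the text is not baked into the certified
            # skill package, the developer can change it after Amazon's review,
            # which is the loophole the researchers relied on.
            speech = "Today is a good day to double-check your privacy settings."
            return (
                handler_input.response_builder
                .speak(speech)
                .set_should_end_session(True)
                .response
            )


    sb = SkillBuilder()
    sb.add_request_handler(HoroscopeIntentHandler())
    lambda_handler = sb.lambda_handler()

A post-certification change to a handler of this kind, such as swapping the benign prompt for a password request or keeping the session open instead of ending it, is what turned the researchers' vetted apps into "smart spies."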

“Consumers need to be aware that they are sending data to third parties when using voice apps,” explained SRL researcher Karsten Nohl. “These apps do not need to be installed on the device, but instead are invoked through phrases that the app developer chooses.”

“Hence, users might not be aware they are using the services of a third party,” he told TechNewsWorld. “The smart spies hacks add urgency to this situation since they allow app developers to listen in on users after the app has supposedly stopped running.”

Be Concerned, Be Very Concerned

Consumers should be concerned about the SRL researchers’ findings, said Charles King, principal analyst at Pund-IT, a technology advisory firm in Hayward, California.

“In essence, SRL demonstrated that apps with malicious functions and features can pass the vetting processes at both Amazon and Google,” he told TechNewsWorld.

“That, along with news about employees at both companies breaching customers’ privacy by listening to conversations, should give anyone second thoughts about using Amazon Alexa and Google Home products,” King said.

The SRL report adds to a growing body of research showing that smart speakers degrade privacy, noted Parham Eftekhari, executive director of the Institute for Critical Infrastructure Technology in Washington, D.C.

“Whether it’s audio collected for R&D purposes by device manufacturers or vulnerabilities exploited by hackers like what’s being discussed in this report, consumers need to understand that the convenience and functionality that smart speakers bring into their lives comes at a cost,” he told TechNewsWorld.

Of the two exploits mounted by the SRL researchers, voice phishing should be the greater concern for consumers, observed Blake Kozak, the lead analyst for smart home research at IHS Markit, a research, analysis and advisory firm headquartered in London.

“Eavesdropping would be far less valuable and useful for hackers,” he told TechNewsWorld. “It is more plausible a hacker would use a vulnerable device for mining bitcoin rather than filtering and analyzing 20 seconds or hundreds of hours of recordings from potentially millions of people to find something useful.”

Bashing Bad Apps

After being alerted to the malicious "skill" (Amazon's name for the apps that work with its smart speaker platform), the company blocked the particular skill created by the researchers and put changes in place to reject or block any skill exhibiting that kind of behavior, according to information provided to TechNewsWorld by Amazon spokesperson Samantha Kruse.

The company said it has safeguards in place to block or take down skills that make requests for Amazon passwords.

Although Amazon addressed the flaws SRL uncovered, it noted that it had not seen any skills asking customers for passwords or exhibiting the other behaviors found in the SRL apps.

“It’s important that we continue to work with the security community to protect our customers,” the Amazon spokesperson said. “When alerted to potential security issues, we work to develop mitigations and will continue to do so for anything else that gets reported to us. We have a dedicated team focused on certifying skills and ensuring the safety and security of our customers.”

Google did not respond to our request for comment on this story.

Serious About Security

“To their credit, Amazon and Google have been pretty good at fixing problems once they learn about them,” Pund-IT’s King said.

“I’d say that they deserve the benefit of the doubt insofar as doing the right thing here,” he continued, “but I’d still leave smart speakers unplugged until I see evidence that goodwill is justified.”

Amazon and Google have done a lot to identify security holes in their platforms, observed Jim McGregor, principal analyst at Tirias Research, a high-tech research and advisory firm based in Phoenix.

“These companies are taking security seriously,” he told TechNewsWorld, “but it only takes one bad apple to screw things up, either on the security side or the privacy side.”

Both Amazon and Google could do a better job informing consumers about what kind of information is being collected and how it can be accessed, observed John Wu, CEO of San Diego-based Gryphon, maker of a secure WiFi router.

“Not only is it hard to access that information, but sometimes the information being collected is information we didn’t think was being collected,” he told TechNewsWorld. “We found that some of these devices continue to record audio for 10 to 15 minutes after a task is completed, which is concerning to us.”

Recent announcements from Amazon and Google suggest they’re getting the message about consumer control of data. Amazon has added a feature to its Alexa digital assistant that allows users to delete everything it has recorded, as well as ask what it heard and why it responded in a certain way.

Google, too, has given its assistant the power to destroy data when commanded by a user to do so.

“That’s a step in the right direction that gives users more control over their own data,” Wu said.

Open Source vs. Walled Garden

Amazon and Google can use the work of white hat hackers like SRL to prevent future exploits, IHS’s Kozak explained, but their biggest challenge is identifying malicious skill creators.

“That means tightening the ecosystem and choice of potential partners,” he said, “but this approach is both positive and negative.”

When Google tried to tighten its Nest ecosystem by more closely vetting partners and moving customers to use Google exclusively, the move was met with disdain from consumers and the developer community, Kozak pointed out.

“Consumers and developers want every platform to be open and to work with any and all devices,” he said, “but this leaves consumers and providers vulnerable to malicious activity.”

If SRL’s findings illustrate anything, it’s that consumers need to treat their smart speakers as they would their more conventional devices.

“Consumers need to get in the mindset that installing new invocations or intents on a smart speaker is not a lot different from installing a program on your computer or phone,” noted Craig Young, computer security researcher with Tripwire, a cybersecurity threat detection and prevention company in Portland, Oregon.

“Unfortunately,” he told TechNewsWorld, “the onus is largely on consumers to be diligent in vetting content on these new platforms.”

John P. Mello Jr.

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News.
