As people shift to working from home amid measures to stem the global coronavirus outbreak, a new security threat has reared its ugly head. Lawyers are now warning that smart speakers, such as Amazon's Alexa-enabled Echo, could be eavesdropping on confidential meetings held at home.
U.K. law firm Mishcon de Reya issued advice to staff to mute or shut down listening devices like Amazon’s Echo or Google’s voice assistant when talking about client matters at home. The firm also suggested that staff not have any such devices near their workspace.
The warning from Mishcon covers any sort of video-enabled or voice-enabled device, such as the aforementioned smart speakers from Amazon and Google. Joe Hancock, who heads Mishcon de Reya’s cybersecurity efforts, said that video products such as Amazon-owned Ring, as well as baby monitors and even closed-circuit TV cameras, are also a concern.
“Perhaps we’re being slightly paranoid, but we need to have a lot of trust in these organizations and these devices,” said Hancock. “We’d rather not take those risks.”
He added that the firm is worried about these devices being compromised, especially cheap knock-off models.
Law firms are currently facing the challenge of creating work-from-home arrangements for specific job functions while maintaining security. Alongside confidential discussions, critical documents and communications also need to be secured. This mirrors the situation faced by Wall Street banks, where some traders are being asked to work not from home but from alternative locations the banks keep on standby for disaster recovery, in order to preserve confidentiality.
Smart speakers have already become notorious for activating in error, making unintended purchases or sending snippets of audio to Amazon or Google. A report from Consumer Intelligence Research Partners put their installed base at 76 million units and growing, which has drawn scrutiny from cybersecurity experts.
For their part, Amazon and Google claim that their devices are designed to record and store audio only after they detect a keyword that wakes them up, and that instances of inadvertent activation are rare. However, a recent study by Northeastern University and Imperial College London found that these accidental activations can happen between 1.5 and 19 times a day.
“Anyone who has used voice assistants knows that they accidentally wake up and record when the ‘wake word’ isn’t spoken – for example, ‘seriously’ sounds like the wake word ‘Siri’ and often causes Apple’s Siri-enabled devices to start listening,” stated the study.
“There are many other anecdotal reports of everyday words in normal conversation being mistaken for wake words,” continued the report. “Our team has been conducting research to go beyond anecdotes through the use of repeatable, controlled experiments that shed light on what causes voice assistants to mistakenly wake up and record.”
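The failure mode the researchers describe can be illustrated with a crude sketch. Real assistants match audio against acoustic models, but the same effect shows up even with simple text-level string similarity: a phonetically close word like "seriously" scores nearly as well as the wake word itself. The wake word, threshold, and matching scheme below are all hypothetical stand-ins, not how any actual device works.

```python
from difflib import SequenceMatcher

WAKE_WORD = "siri"
THRESHOLD = 0.6  # hypothetical sensitivity; real devices tune acoustic models, not strings


def wake_score(heard: str) -> float:
    """Slide the wake word across the utterance and keep the best fuzzy match."""
    heard = heard.lower()
    if len(heard) < len(WAKE_WORD):
        return SequenceMatcher(None, WAKE_WORD, heard).ratio()
    best = 0.0
    for i in range(len(heard) - len(WAKE_WORD) + 1):
        window = heard[i:i + len(WAKE_WORD)]
        best = max(best, SequenceMatcher(None, WAKE_WORD, window).ratio())
    return best


def is_awake(heard: str) -> bool:
    """True when the utterance is close enough to the wake word to trigger recording."""
    return wake_score(heard) >= THRESHOLD


print(is_awake("hey siri"))     # exact match triggers, as intended
print(is_awake("seriously"))    # false positive: "seri" closely resembles "siri"
print(is_awake("hello there"))  # stays asleep
```

Because any matcher must tolerate accents, background noise, and imperfect speech, the threshold cannot be set so strictly that only the exact wake word triggers it, which is why everyday words keep slipping through.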
One of the more concerning things about these devices is that the companies behind them are actively listening in. Last year, Amazon admitted not only that Alexa saved recorded audio even after users deleted it, but also that its employees were actively listening to those recordings.
“This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” Amazon said in a statement after the fact came to light.
“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow.”
Despite this, Amazon does not explicitly state in its terms and conditions that employees review customer recordings. That said, Alexa’s privacy settings do offer users the chance to opt out of helping the firm “develop new features.”
COPYRIGHT © 2017 NEWSTARGET.COM