Eavesdropping WFH coworkers: Alexa, Siri, and Google

Working from home, even during a stressful time like the coronavirus pandemic, has its advantages. Meetings become emails. Suits can be traded for jeans (or, more likely, sweatpants). And, whether you realize it or not, you have a new coworker who likes to eavesdrop: your smart speaker device.

Smart speakers don’t constantly record our conversations. Instead, they are triggered by “wake words,” such as “Alexa” or “Ok Google.” But they do accidentally wake up and record even when the wake word isn’t spoken. According to a 2020 study from Northeastern University, smart speaker devices accidentally activate as many as 19 times a day, recording as much as 43 seconds of audio each time.

Those audio recordings are generally stored on company servers, though some companies allow users to opt out of storage or to delete their recordings. Amazon, Apple, and Google all employ people to listen to recordings in order to improve speech recognition, but after some high-profile privacy breaches, Amazon and Apple now allow users to opt out of that human review. Google was forced to pause its human quality checks while German regulators investigated whether the practice complies with GDPR requirements. Unsurprisingly, smart speaker data has become a target for hackers, and the devices themselves can be hacked if the internet connection they rely on is compromised.

Regulation of the smart speaker device industry in the United States is developing fast but struggling to keep pace with technology.

  • Oregon and California recently passed laws that require makers of smart speaker devices to equip them with “reasonable security” features, and a number of other states are considering similar legislation.
  • Proposed legislation in California would prohibit the use or sale of audio recordings containing personal information and require makers of smart speaker devices to get permission from a user before storing recordings.
  • Similar proposed legislation in New York would require manufacturers to obtain user permission before storing recordings, using recordings for advertising purposes, or selling or sharing recordings with third parties.
  • Illinois is considering legislation that would require makers of smart speaker devices to disclose to users, on their websites, the names of all third parties receiving personal data.

Courts have also ordered smart speaker recordings to be turned over in litigation. In the meantime, consumers have brought suits against smart speaker device makers under various privacy laws. Amazon, Apple, and Google have been hit with lawsuits under the California Invasion of Privacy Act, the Massachusetts Wiretap Statute, and other similar state laws by customers who allege that smart speakers recorded and stored audio without their consent. Two class action lawsuits against Amazon allege that its Alexa voice assistant is recording and storing children's voices without their parents' consent.

The capture, storage, and use of audio by smart speaker devices raise a number of concerns. Some might be resolved through design fixes and improvements, but others may only be resolved through legislation or the judicial process. For now, you might just want to mute or shut off your smart speaker during work hours.
