In Smart House, the 1999 classic DCOM (that’s Disney Channel Original Movie, for the non-millennials), a teenage boy wins a state-of-the-art home for his family, fully outfitted with an artificially intelligent smart assistant that has coffee ready when they wake, cleans the floor after a party, keeps the family dog occupied and even plays music to match the mood.

Eighteen years later, that science-fiction ‘90s tale has turned into reality (minus, of course, the Smart House assistant’s level of self-awareness and dubious attempt to hold the family hostage). But the general trajectory — an AI-powered home assistant that learns more about you and your preferences over time — is precisely the point of products such as Amazon Echo or Google Home. And although experts say we’re still far from even the possibility of AI on par with human consciousness, there’s a different — albeit less traditionally frightening — aspect of this tech. That’s the idea that what it hears could, in some extreme cases, be used against you.

Recently, a judge ordered Amazon to turn over recordings from an Echo device in a double-homicide case in New Hampshire, where two women were stabbed to death in January 2017. Prosecutors said they believe the smart home assistant may have recorded audio that could shed light on the deaths, but Amazon officials said the company won’t release any information until a valid legal demand is served.

Then there was the widely reported case, dating to 2015, in which Amazon Echo data was sought in a murder investigation. Prosecutors believed Amazon might hold recordings that could clarify the events that led police to find a man dead in an Arkansas man’s hot tub. The defendant later voluntarily handed over the recordings, and the charges were dropped in 2017 for lack of evidence.

Theoretically, the government can request evidence from a smart home device in any criminal investigation: arson, auto theft, larceny and more. That’s especially true when it comes to placing someone at a given location, whether to corroborate or disprove an alibi. If you say you weren’t at home one evening, but stored recordings suggest you were in your living room telling Alexa to “please order pizza,” you’ll likely have some questions to answer.

“That’s pretty damning evidence right there,” said Richard Forno, an affiliate of Stanford Law School’s Center for Internet and Society (CIS).

If smart home devices are “always listening,” as many might think, why couldn’t the recordings provide any clarity in the 2015 murder case? To answer that question, it helps to look at how these devices actually work. The idea that they’re “always listening” is only somewhat true. When switched on, smart home assistants such as Amazon Echo and Google Home default to a “passive listening mode,” meaning they record in seconds-long intervals and parse the sounds they hear in a process called “device keyword spotting.” Tech companies say the devices only start saving and transmitting audio once they hear their “wake word” (such as “Alexa” or “Okay Google”); otherwise, they continuously overwrite and discard each short snippet of sound, never sending any of it to the cloud.
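To make that distinction concrete, here is a minimal sketch, in Python, of how a rolling “passive listening” buffer and a wake-word check might fit together. The names and parameters (capture_frame, detect_wake_word, stream_to_cloud, the buffer length) are illustrative stand-ins, not Amazon’s or Google’s actual firmware, which is proprietary and far more sophisticated; the point is simply that audio older than a few seconds falls out of memory unless a wake word is spotted.

import collections
import time

# Illustrative parameters: a few seconds of audio held on the device at a time.
FRAME_SECONDS = 0.5                      # length of each captured audio chunk
BUFFER_SECONDS = 3.0                     # rolling window kept in local memory
FRAMES_IN_BUFFER = int(BUFFER_SECONDS / FRAME_SECONDS)

# A fixed-size ring buffer: once full, appending a new frame silently drops the
# oldest one -- the "overwrite and discard" behavior described above.
ring_buffer = collections.deque(maxlen=FRAMES_IN_BUFFER)

def capture_frame():
    """Stand-in for reading roughly half a second of microphone audio."""
    return b"\x00" * 8000                # placeholder bytes, not real audio

def detect_wake_word(frames):
    """Stand-in for the on-device keyword-spotting model ("Alexa", "Okay Google")."""
    return False                         # a real device runs a small local speech model here

def stream_to_cloud(frames):
    """Only reached after a wake word; everything before that stays on the device."""
    print(f"uploading {len(frames)} buffered frames for full speech recognition")

def passive_listening_loop(mic_muted=lambda: False):
    while True:
        if mic_muted():                  # hardware mute: nothing is captured at all
            time.sleep(FRAME_SECONDS)
            continue
        ring_buffer.append(capture_frame())
        if detect_wake_word(ring_buffer):
            stream_to_cloud(list(ring_buffer))
            ring_buffer.clear()
        # Otherwise do nothing: the oldest frame is simply pushed out the next
        # time the buffer fills, and it is never sent anywhere.
        time.sleep(FRAME_SECONDS)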

There have, of course, been glitches and errors in this practice. In May 2018, an Amazon Echo owned by a couple in Portland, Ore., recorded an in-home conversation, then sent it to the phone of a person in their contact list. In a statement to Recode, Amazon described the incident as a fluke, saying the device woke up to a word that sounded like “Alexa,” then interpreted background conversation as a “send message” request to a specific contact. Still, the company said it was working to prevent similar chains of events in the future.

The good news: Besides unplugging the device, there are other ways to switch off passive listening mode. Amazon Echo, for example, offers a “mute” button that signals the device to stop listening for a wake word, and that mode is denoted by a red ring of light. Inside each device, there’s a single wire that powers either the microphone or the red light — and when the red light is on, there’s a physical protection against the microphone being active.

The reason experts say these features are largely effective? It’s not just a software protection (which, given the malleability of computer code, could theoretically be tampered with or hacked more easily). Instead, it’s a “much more robust” hardware protection, said John Verdi, vice president of policy at the Future of Privacy Forum (FPF). Because the circuit can power either the red light or the microphone, but never both, defeating the mute would likely require physical tampering; it couldn’t be done via a remote software hack. That’s why it’s worth taking note of any smart device’s hardware protections as well as its software security.

If you’re concerned about the information your device has on you, most assistants let you listen to, and permanently delete, your stored recordings. A Google Home user can click “Google Account” beneath their profile picture in a web browser, then, on the next page, click “Manage your Google activity” (under the “Personal info & privacy” section), followed by “Go to my activity” (under the “Review activity” section). You can delete queries individually or, to delete all stored queries at once, select the “Delete activity by” link on the left side of the page and choose “all time.” Amazon Echo users, on the other hand, can delete stored information via the Alexa app by clicking “Settings” under the hamburger icon, then choosing “History”; from there, you can listen to and delete individual recordings. To delete them all at once, head to Amazon’s Manage Your Content and Devices page. (Tech companies warn that features such as voice recognition and personalization may suffer when stored recordings are deleted.)

Beyond cases in which a crime is suspected to have been committed in or around a home, there’s another scenario in which the government may be able to legally access your Amazon Echo or Google Home data: exigent circumstances. That means that if police have reason to believe a crime is in progress (a burglary, a hostage situation, armed trespassing), they can ask a tech company to bypass the typical legal process of a warrant or court order and quickly grant access to a live feed inside a home or business, Verdi said. The company can then either cooperate voluntarily or decline. Absent exigent circumstances, a piece of legislation called the Electronic Communications Privacy Act stipulates when companies can and cannot turn over user data.

If you’ve got a smart home with a number of connected devices beyond an Amazon Echo or Google Home, there are other data privacy issues to consider. Smart home assistants “tend to be better secured than some of the other connected devices that are out there,” Verdi said; Amazon and Google both have relatively robust cybersecurity operations and patch their devices’ vulnerabilities regularly. Other Internet of Things (IoT) devices, such as “smart” thermostats, refrigerators, security systems, toys and coffee makers, are typically more vulnerable to hackers and may also be more susceptible to overbroad government data requests. “Most of these devices aren’t designed with security in mind,” Forno said.

In the U.S., privacy protections aren’t as robust when it comes to metadata, the account information that describes a user’s relationship with a platform (for example, how many times a day you turn on your smart television and which hours it’s normally in use). That information can be accessed by the government via a subpoena, which carries a “lesser standard of proof than a warrant or court order,” Verdi said. It can also be used to target consumers ever more specifically, based on assumed socioeconomic status, habits and routines. The data we generate as users can give companies an “intimate profile” of our living patterns, from which type of milk we use and how often we buy it to what time we get up and how often we make coffee, and that profile can be monetized for commercial purposes, Forno said. There may come a day, for example, when smart refrigerator users who buy “healthier” groceries are offered discounted health insurance.

“Anything smart,” Forno said, “usually means it’s being smartly used against you.”

Source: Entrepreneur.com


