Privacy No More
By Albert B. Kelly
How would you like to have someone listening in on just
about every word uttered in your home? If you’re like me, you find the thought
chilling and more than troubling. The privacy and sanctity of our homes are
values that date back to the founding of our country. So valued is the idea
of privacy and protected communications that the courts have carved out the
specific circumstances when it can be breached. Yet for all of that, many of us
willingly bring in “smart speakers” that listen in on and save our every word.
If you doubt that, consider what Washington Post tech
columnist Geoffrey Fowler shared about his own experiences with Amazon's
virtual assistant Alexa on the public radio program "Fresh Air." According to Mr.
Fowler, Alexa did all the normal functions Amazon sells us on such as playing
music, ordering pizza, or searching the internet for recipes; but it also
randomly recorded conversations that had nothing to do with engaging the device.
Fowler knows this because he went to Amazon's website (www.amazon.com, under
Alexa privacy) and listened to approximately four years' worth of recordings
Amazon made in his home, something you can do yourself if you have this device and want
to know. Without prompting, Alexa recorded conversations among family members
about medications, business transactions, and snippets of multiple other
conversations that had nothing to do with engaging the device to fulfill a
specific request or carry out a function.
In addition to simply listening, Amazon keeps the recordings
forever and the only thing you can do about it is go to Amazon’s website and
delete recordings individually, assuming they actually disappear. What you
can’t do is select a setting that tells Amazon to stop recording you and your
family. According to Mr. Fowler, Amazon says that it saves all of our data to
help make Alexa "smarter." Yet it's not just Amazon's Alexa.
Apparently Google Assistant and most likely Microsoft’s
Cortana listen to our conversations as well. Google records everything, but
doesn’t keep the recordings indefinitely. And it’s not just smart speakers in
our homes that listen to us. Apple listens to us through Siri on our iPhones as
well as through HomePod, which is their version of the home speaker. In the
case of Apple, they keep everything they hear but supposedly “anonymize” the
data so that it can’t be connected to an individual. The bottom line is that these
devices listen to everything we say.
You can put Alexa on mute, which might not be a bad idea if
you crave privacy. But it doesn’t stop there. According to Mr. Fowler, we can
be tracked in other ways in the privacy of our homes. As tech guy, Mr. Fowler
has Google’s Nest Thermostat (also connected to Alexa) and what he discovered
was that every 15 minutes, for a period of 6 years, this “smart” thermostat would
send out signals to detect whether a person passed in front of it in his
hallway and sent the information to both Google and Amazon. Much the same
happened with his connected garage door opener and smart light bulbs. Talk
about profiling.
As the world gets more technical, more connected, more
digital, and whatever else comes with these changes, we have to decide what we're
willing to live with. We have to decide how much we trust these companies that
use, manipulate, and sell the data they collect about us and our families.
Ultimately, we have to decide how comfortable we are with the government having
this data, because data translates into power, and the temptation to abuse such
power is just too great.
Right now, there really are no laws that protect us as users,
consumers, and citizens. I know many people take the view that they have
nothing to hide, which might be true as far as it goes, but it's not a far leap
to think that one day everything, from whether or not we get insurance or get it
at an affordable price to securing a mortgage at a decent interest rate, will
connect back to the conclusions drawn from our data.
China plans to rank all its citizens based on their
"social credit" by 2020. Citizens can be rewarded or punished
according to these scores. Similar to how we do credit scores, a person's
social score will move up and down according to how their behavior is perceived.
Think something similar can’t happen here?