Two weeks ago, we asked whether you thought voice-activated A.I. queries made via Alexa, Siri, Google Assistant, or Microsoft's Cortana should be heard by humans at (or contracted by) big tech companies. The results are in... and your voice is resolute on this one.

Some 86 percent of respondents say no human should ever hear a user’s voice query.

We posed this question in the wake of news suggesting Apple was allowing contractors to listen to Siri queries. Looking at the landscape now, all four major tech companies offering voice-activated A.I. assistants (Amazon, Google, Apple, and Microsoft) have been found to be using human contractors to listen to our voice queries. Why? In theory, a human being can understand nuances in language that a bot cannot; these companies want that analysis to further improve their assistants' language abilities.

Naturally, these big tech companies tread carefully when it comes to privacy: All claim that the requests heard by humans are anonymized and meant only to improve the service, not to spy on users. Still, some alarming things can be overheard via voice-activated search. (Consider a heated argument with your partner where you said a whole lot of things in a fit of rage you definitely didn’t mean; anonymized or not, it’s an embarrassing thing to have floating around out there, heard by strangers.)

Part of the issue is that some data is never truly anonymized. In 2017, Amazon agreed to hand over Alexa data in a murder case. While the accused agreed to have this data submitted as evidence in order to exonerate him, the very presence of such data is worrisome in 2019, when we're generally more aware of the broader privacy implications of such things.

For example, if a contractor overheard something illegal happening on an Alexa clip, could Amazon cross-reference the recording against its database of personalized queries, discover the user's identity, and alert the police? The company has already shown a willingness to become Big Brother; it is striking deals with cities and users to provide Ring doorbell cam footage to local police departments.

Such ethical dilemmas were foreseeable long before this news broke. The public had no idea such listening was taking place, and it’s reasonable to doubt the intentions of companies that are in the practice of gathering the data. We also didn’t know Facebook was allowing companies such as Cambridge Analytica to run amok. What you don't know can still hurt you.

Apple has since been sued for violating user privacy by recording Siri data. The U.S. government is also joining European lawmakers in examining how companies handle voice queries and the associated data.

These A.I. platforms are all part of a push into ambient computing: the idea that we’ll simply talk to computers, rather than using keyboards and other physical input devices, to complete a great number of tasks. It’s easier to speak a command than to ask your brain to spin up some finger muscles, but the urge to have contractors listen to our data is nonetheless part of an arms race to capture market share in ambient computing. Our uneasiness about it likely stems from the memory of how Google effectively became spyware for ads, Facebook strip-mined our data for revenue, and Amazon used searches on its own site to target ads everywhere.

Ambient computing doesn’t yet have ads, but casual remarks made in the presence of an always-listening device could end up used in the course of data-mining. Stay aware.