They can hear you: 6 ways tech is listening to you

By Access Now (guest author) · July 14, 2021

This article was first published here by EDRi’s member Access Now.

Contribution by: Jennifer Brody, U.S. Advocacy Manager, and Leanna Garfield, Digital Engagement Coordinator, of EDRi member Access Now

Voice recognition technology often violates human rights, and it’s popping up more and more. Recently EDRi’s member Access Now called out Spotify for developing voice recognition tech that claims to be able to detect gender and emotional state, among other things.

But it’s not just Spotify. Some of the most powerful companies in the world are deploying similar abusive tech because harvesting data about you is profitable. The market for voice recognition is growing, expected to be worth a whopping $26.8 billion by 2025.

This is not an exhaustive list, but below are some dangerous examples:

DANGEROUS EXAMPLES OF HOW TECH COMPANIES CAN LISTEN TO YOU

McDonald’s: recognizing your voice at drive-thrus

McDonald’s has been testing voice recognition tech at 10 drive-thru locations in and around Chicago. The company says it uses the tech to identify your “age, gender, accent, nationality, and national origin.” In June, a customer at an Illinois McDonald’s filed a lawsuit alleging that the company violated state law by using voice recognition to take his order without consent.

Amazon Halo Wristband: monitoring the tone of your voice to infer how you feel

Through its Halo wristband, Amazon claims it can detect “positivity” and “energy” in your voice to help you improve your communication skills. Creepy much? Tone policing is here.

TikTok: collecting your “voiceprints”

In June, the popular social media app TikTok updated its privacy policy to let users know it may collect “voiceprints” to help the company with its “demographic classification,” which could include everything from race to income to gender to marital status.

Call centers: knowing if you’re agitated

Some call centers are using AI to try to detect your emotions, and may connect you to a representative best equipped to sell you a certain product based on your “angry” feelings.

Samsung: your smart fridge is recording you

In some of its smart fridge models, Samsung likely uses recordings of your voice, which it may store on its servers, to sell you more products.

HireVue: when your voice decides if you get a job

Following an algorithmic audit and intense public pressure from civil society, AI hiring company HireVue finally dropped facial recognition from its software. But while this means the company won’t use facial recognition to decide if you get hired, nothing is stopping it from using its voice recognition software to make equally problematic inferences about your suitability based on how you talk.

All of this is alarming, unwarranted, and, in certain jurisdictions, illegal. Using voice recognition tech to make inferences about us invades our private lives and reinforces harmful, regressive stereotypes.

We’re keeping an eye on this emerging tech, and calling out companies to hold them accountable. You deserve respect, not exploitation.

And Access Now is not the only one paying attention. With EDRi and other partners from around the world, Access Now launched a campaign to ban biometric surveillance and a call to outlaw automated recognition of gender and sexual orientation.

Spread the word: retweet about the proliferation of this dangerous tech here.

We shouldn’t have to worry about our smart refrigerators, voice assistants, and apps with a microphone listening to us, profiling us, and trying to read our minds.

Image credit: Access Now