For immediate release: May 11, 2022

303-594-4321

Groups flag that this technology is based on pseudoscience and call on the industry leader in video calls to make sure it doesn’t become the standard.

Fight for the Future and 27 human rights organizations are challenging Zoom to stop its exploration of emotion tracking and analysis AI. Today, the groups sent a letter responding to a Protocol report that highlighted Zoom’s development of emotion tracking technology.

The groups hope that this letter (along with a campaign page where individuals can sign a petition to Zoom) will pressure the company to abandon plans to mine users for emotional data points, a violation of their privacy and human rights.

This software is discriminatory, manipulative, potentially dangerous, and based on the assumption that all people use the same facial expressions, voice patterns, and body language. It is inherently biased and linked to racist and discriminatory practices such as physiognomy. Zoom’s use of this software gives credence to the pseudoscience of emotion analysis, which experts agree does not work. Facial expressions vary significantly and are often disconnected from the emotions beneath them; even humans frequently cannot decipher them accurately.

Citing these factors, the groups, which represent broad constituencies, called on Zoom to publicly commit to stopping its use of this software. Read the full letter.

Signers issued the following statements:

“If Zoom advances with these plans, this feature will discriminate against people of certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices. Beyond mining users for profit and allowing businesses to capitalize on them, this technology could take on far more sinister and punitive uses. It’s not hard to imagine employers and academic institutions using emotion analysis to discipline workers and students perceived to be ‘expressing the wrong emotions’ based on faulty AI.” – Caitlin Seeley George, Director of Campaign and Operations at Fight for the Future.

“Large and influential Big Tech companies like Zoom now wield enormous global power (exceeding that of many national governments), and with that power comes the responsibility to respect and not violate privacy, equality, and other human rights. Widespread deployment by Zoom of artificial intelligence for routine ‘emotion recognition’ would represent unrealistic cyberutopian inattention to the risks posed — especially to already marginalized individuals and groups — and would be not only factually inaccurate, misleading, and deceptive, but also discriminatory, intrusive, and downright creepy.” – Chip Pitts, Chair, Advocacy for Principled Action in Government

“Adopting the junk science of emotion detection on the Zoom platform would be a huge mistake. There is zero reliable evidence that a machine can accurately assess someone’s emotional state and a lot of evidence that one-size-fits-all assumptions about ‘normality’ don’t mirror human diversity and punish out-groups for differences. Not to mention that the business of an employer or a teacher is how a person performs in the role and relationship at hand. It’s a choice to disclose feelings and issues in other aspects of a person’s life – and it should stay that way.” – Tracy Rosenberg, Oakland Privacy

“In this age of rampant manipulation and harm done by technology and media communications platforms, we at OICD urge Zoom to show leadership through prudent caution instead of rolling out ill-considered new technological features like Emotion AI that might harm the self-determination and identity development of vulnerable minorities by entrenching biases and stereotypes.” – Dr. Bruce White, Executive Director, OICD (Organization for Identity and Cultural Development)

“Our emotional states and our innermost thoughts should be free from surveillance. Emotion recognition software has been shown again and again to be unscientific, simplistic rubbish that discriminates against marginalized groups, but even if it did work, and could accurately identify our emotions, it’s not something that has any place in our society, and certainly not in our work meetings, our online lessons, and other human interactions that companies like Zoom provide a platform for.” – Daniel Leufer, Senior Policy Analyst, Access Now

The organizations signed on to the letter are:

Access Now
ACLU
Advocacy for Principled Action in Government
Defending Rights & Dissent
Državljan D / Citizen D
Electronic Privacy Information Center (EPIC)
Fight for the Future
Fundación InternetBolivia.org
Global Voices
Homo Digitalis
Jobs With Justice
Kairos
Muslim Justice League
Neon Law Foundation
Oakland Privacy
Open MIC
OpenMedia
Organization for Identity and Cultural Development
PDX Privacy
PEN America
Ranking Digital Rights
RootsAction.org
Secure Justice
Simply Secure
Surveillance Technology Oversight Project
Taraaz
United We Dream
X-Lab