For immediate release: May 10, 2022

978-852-6457

To Eric S. Yuan
Founder & Chief Executive Officer, Zoom

Dear Mr. Yuan,

We are writing to you as a group of organizations concerned about Zoom’s exploration of emotion tracking software. Zoom claims to care about the happiness and security of its users, but this invasive technology says otherwise.

This move to mine users for emotional data points, based on the false premise that AI can reliably track and analyze human emotions, is a violation of privacy and human rights. Zoom must halt plans to advance this feature.

In particular, we are concerned that this technology is:

  • Based on pseudoscience: Experts agree that emotion analysis does not work reliably. Facial expressions are often disconnected from the emotions underneath them, and research has found that even humans cannot always accurately read or measure the emotions of others. Developing this tool lends credence to pseudoscience and puts your reputation at stake.
  • Discriminatory: Emotion AI, like facial recognition, is inherently biased. It is rooted in practices like physiognomy, which have been shown to be misleading, erroneous, and racist. These tools assume that all people use the same facial expressions, voice patterns, and body language, but that is not true. Adding this feature would discriminate against certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices.
  • Manipulative: You are already marketing sentiment analysis as a way for businesses to capitalize on users and close deals faster. Offering emotion analysis to monitor users and mine them for profit will further expand this manipulative practice.
  • Punitive: This flawed technology could be dangerous for students, workers, and other users if their employers, academic institutions, or other authorities decide to discipline them for “expressing the wrong emotions” based on the determinations of this AI.
  • A data security risk: Harvesting deeply personal data could make any entity that deploys this tech a target for snooping government authorities and malicious hackers.

In the past, you have made decisions that center users’ rights, such as when you reversed your plan to block free users from your encrypted service and when you canceled face-tracking features because they did not meet privacy standards.

This is another opportunity to show you care about your users and your reputation. Zoom is an industry leader, and millions of people are counting on you to steward our virtual future. As a leader, you also have the responsibility of setting the course for other companies in the space. You can make it clear that this technology has no place in video communications.

As you continue to grow, it is critical that you maintain a relationship of trust and respect with your users. We ask you to publicly respond to this request by May 20, 2022, and commit to not implementing emotion AI in your products.

Signed,

Access Now
ACLU
Advocacy for Principled Action in Government
Defending Rights & Dissent
Državljan D / Citizen D 
Electronic Privacy Information Center (EPIC)
Fight for the Future
Fundación InternetBolivia.org
Global Voices
ΗΟΜΟ DIGITALIS
Jobs With Justice
Kairos
Muslim Justice League
Neon Law Foundation
Oakland Privacy
Open MIC
OpenMedia
Organization for Identity and Cultural Development
PDX Privacy
PEN America
Ranking Digital Rights
RootsAction.org
Secure Justice
Simply Secure
Surveillance Technology Oversight Project
Taraaz
United We Dream
X-Lab