FOR IMMEDIATE RELEASE: August 14, 2019
Contact: Evan Greer, (508) 368-3026, press@fightforthefuture.org

Yesterday, Amazon announced that they had updated Rekognition, the facial recognition software they’re aggressively marketing to immigration and law enforcement agencies, with new capabilities. Among them: the company now claims their face scanning algorithm can detect “fear.” 

It’s notable how casually Amazon made this announcement, in a one-paragraph blog post, as if it were any other routine software update with a few new features. The implications for human rights are staggering. Face scanning surveillance that claims to detect people’s emotions, inner thoughts, or intentions is an authoritarian government’s dream come true. In the hands of law enforcement, such a tool could be deadly.

“Amazon is going to get someone killed by recklessly marketing this dangerous and invasive surveillance technology to governments,” said Evan Greer, deputy director of Fight for the Future (pronouns: she/her). “Facial recognition already automates and exacerbates police abuse, profiling, and discrimination. Now Amazon is setting us on a path where armed government agents could make split-second judgments based on a flawed algorithm’s cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid when being questioned by authorities. The dystopian surveillance state of our nightmares is being built in plain sight by a profit-hungry corporation eager to cozy up to governments around the world.”

It’s incredible that Amazon is touting new features for their face scanning software when the existing ones don’t seem to work at all. Their announcement claiming Rekognition can now smell your “fear” came on the same day that the ACLU revealed it had used the software to compare California lawmakers to a mugshot database, and it incorrectly matched 1 in 5 of them with photos of criminals. The ACLU previously used Amazon Rekognition to do the same thing with members of Congress, with similar results, and the majority of those misidentified as criminals were lawmakers of color. The Orlando Police Department, one of the only departments in the country that has publicly acknowledged testing Amazon’s facial recognition surveillance product, abandoned the program, saying it wasn’t workable.

Amazon’s announcement comes amid a growing nationwide backlash against facial recognition surveillance. Last month Fight for the Future launched our BanFacialRecognition.com campaign, along with an interactive map showing where in the US facial recognition surveillance is being used and where local and state efforts to ban it are underway. San Francisco, Somerville, MA, and Oakland, CA, recently became the first cities in the country to ban the technology. Berkeley, CA and Cambridge, MA are also considering bans, and bills to halt current use of the tech are moving in the Massachusetts and Michigan legislatures. In Congress, there is growing bipartisan agreement to address the issue, but it could easily stall under pressure from law enforcement and big tech.

Fight for the Future opposes attempts by the tech industry (including Amazon) and law enforcement to pressure Congress to pass an industry-friendly “regulatory framework” for facial recognition that would allow this dangerous technology to spread quickly, with only minimal restrictions intended to assuage public opposition. But we support narrower efforts to ban or restrict particularly egregious uses of this surveillance, such as a recently introduced bill to ban the use of facial recognition in public housing. For more on our position, read our op-ed in BuzzFeed News: “Don’t regulate facial recognition. Ban it.”

###