Amazon Announces One-Year Pause On Police Use Of Its Facial Recognition Tech

The decision comes amid nationwide scrutiny of police powers following the killing of George Floyd.

Amazon announced Wednesday it’s implementing a year-long moratorium on law enforcement use of “Rekognition,” its facial recognition technology.

The decision comes amid a nationwide focus on police powers and actions following the May 25 killing of George Floyd, a 46-year-old Black man, by a white officer in Minneapolis.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” Amazon said in a statement. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

It’s unclear how many police agencies use Rekognition, though Amazon has been pitching it to them since at least 2017.

Facial recognition technology represents a powerful law enforcement tool that can be easily abused and lead to mistaken identifications. Critics also argue it effectively amounts to mass government surveillance.

Nicole Ozer, technology and civil liberties director with the American Civil Liberties Union of Northern California, welcomed Amazon’s decision but warned it doesn’t go far enough.

“This surveillance technology’s threat to our civil rights and civil liberties will not disappear in a year,” Ozer told HuffPost in an emailed statement. “Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same. They should also commit to stop selling surveillance systems like Ring that fuel the over-policing of communities of color.”

“Face recognition technology gives governments the unprecedented power to spy on us wherever we go,” Ozer added. “It fuels police abuse. This surveillance technology must be stopped.”

In 2018, an ACLU test of Rekognition found the software falsely matched 28 members of Congress against a database of 25,000 mugshots. Representatives of color were disproportionately likely to be misidentified by the system.

A New York Times exposé of a Rekognition competitor called Clearview AI earlier this year also prompted alarm. The company ― which has links to far-right extremists ― surreptitiously scraped photos from social media to assemble a database of more than 3 billion images, then quietly sold access to it to law enforcement agencies across the country.
