AI Researchers Want Amazon to Stop Helping Cops With Facial Recognition Tech

By William Vogeler, Esq. on April 09, 2019 12:58 PM

Dozens of artificial intelligence researchers are calling on Amazon to stop selling facial-recognition software to law enforcement.

In an open letter, the researchers criticized Amazon's "Rekognition" tech after complaints in Orlando and Portland. They emphasized the "poor state of the art of current facial analysis technology." The problem for civil rights, they say, is how the flawed technology can perpetuate race and gender bias.

Amazon's Rekognition

Yoshua Bengio, a winner of the Turing Award for his work on deep learning, lent his name to the letter. Other signatories include Microsoft's Hal Daume III and Caltech's Anima Anandkumar.

They say Amazon should withhold Rekognition from law enforcement until "legislation and safeguards" are in place to protect civil liberties. Often-biased algorithms are the problem, they said.

For women of color, according to their letter, Amazon's tech had "errors of approximately 31 percent." They also noted that non-binary genders are not represented in the facial recognition software. "Out of all of the tasks categorized as face recognition or facial analysis, classifying faces into two binary gender options, as performed by Amazon and others' gender classification APIs, is technically simplistic (without accounting for the social complexity)," they wrote.

It wasn't the first time AI experts have questioned the technology. In January, MIT's Deborah Raji and Joy Buolamwini revealed their findings that facial recognition misidentifies women and people with darker skin more often than other subjects. The American Civil Liberties Union has been beating a similar drum for a year.

Microsoft and Google

Microsoft also works with law enforcement, but the researchers said its technology is inaccurate 22 percent of the time when classifying gender. Google is reportedly studying the problem; the company said it is not ready to supply facial recognition to law enforcement, according to Technology Review. Amazon representatives, for their part, have defended Rekognition. Michael Punke, vice president of public policy, and Matt Wood, general manager of deep learning, said it has many benefits for law enforcement.

Meanwhile, Microsoft, Google, and Amazon are all working with regulators on potential legislation.
