Amazon + Facial Recognition + Police = What Could Possibly Go Wrong?

In a move that will surprise no one who reads science fiction, Amazon is now selling a facial recognition tool, called Rekognition, to local police departments, marketing it as a “low cost” way to track persons of interest. According to the company, the tool can search collections of “tens of millions of faces” and can pick out “up to 100 faces in challenging crowded photos.” What could possibly go wrong?

Amazon says that by selling Rekognition to police departments, it’s giving agencies the ability to search for “persons of interest” in criminal investigations. For example, if a crime is committed at a public event and there is surveillance footage of the crime, the police department could use the tool to cross-reference the suspect’s face against a database of photos, such as its own mugshot collection.

Orlando Police Chief John Mina says that his department will be using Amazon’s tool in conjunction with the resources the city already has, such as access to surveillance cameras and mugshot databases, to “provide real-time detection and notification of people of interest.”

In some countries, this tool could be used for serious political repression—imagine what Hungary or the Philippines or China would do with it.

So let’s pretend, just for the sake of argument, that the same government that employed Lois Lerner and Michael Slager will always behave perfectly responsibly. After all, facial recognition isn’t brand new: federal law enforcement agencies, including the Department of Homeland Security, already use biometric surveillance to track immigrants, and Sen. John Cornyn has introduced a bill that would mandate facial recognition screening for all travelers at border crossings and airports.

What, then, would be the problem with using facial recognition in criminal investigations?

As of 2014, the Federal Bureau of Investigation had some 52 million photos in its Next Generation Identification (NGI) database, which includes both criminals and non-criminals. When matching a face, NGI returns a list of 50 “top matches,” but Techdirt has estimated that this list includes the right person only 80 percent of the time. Other facial recognition algorithms do better: Facebook, for example, claims its system matches the correct face 97.25 percent of the time.

However, a 2016 MIT Technology Review article found that statistics from even the most accurate facial recognition tools, such as Facebook’s and Google’s, can be misleading: “[W]hile state-of-the-art face matching systems can be nearly 95 percent accurate on mugshot databases, those photos are taken under controlled conditions with generally cooperative subjects. Images taken under less-than-ideal circumstances, like bad lighting, or that capture unusual poses and facial expressions, can lead to errors.” Surveillance footage of a crime committed in the dead of night, for example, would be far less likely to yield a proper match.

Additionally, facial recognition software has a poor track record with African American faces. MIT researchers found in 2018 that the darker a person’s skin, the more likely these systems were to misidentify them: darker-skinned men were classified as the wrong gender 12 percent of the time, while lighter-skinned men were misclassified only 1 percent of the time. If these systems cannot consistently recognize someone’s gender, how can we expect them to reliably identify a suspect in a criminal investigation?

And the problem here isn’t just that the tools will be ineffective—it’s that their mistakes will lead to the investigation (or arrest, or indictment) of people they wrongly identify.

Amazon’s move to partner with law enforcement jeopardizes Americans’ privacy and due process rights. It’s bad enough that the federal government does this sort of thing. Big Tech shouldn’t be giving local governments the same power.
