Tech giants back off from providing facial recognition to police

Three major tech companies have backed away from their facial recognition programs amid long-standing concerns about privacy and the technology's higher error rates when identifying people of color.

Amazon, Microsoft, and IBM have all pumped the brakes on their facial recognition programs in recent days amid a national debate about racism and the excessive use of force by police. The killing of George Floyd, a black man, by a white Minneapolis police officer on May 25 has led to widespread protests and prompted many companies to voice support for the Black Lives Matter movement.

First, on June 8, IBM sent a letter to members of Congress saying it would no longer offer “general purpose” facial recognition or analysis software.

“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values,” wrote CEO Arvind Krishna. “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”

Two days later, Amazon announced a one-year moratorium on police use of its controversial Rekognition program. “We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the company said.

Privacy advocates have long criticized Amazon’s facial recognition program. The American Civil Liberties Union published a report in May 2018 revealing that the company was quietly selling Rekognition to law enforcement agencies, and in July 2018 the ACLU tested Rekognition against a database of mugshots and found that it falsely matched 28 members of Congress with arrest photos, with people of color disproportionately represented among the false matches.
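For context on how such a test works in practice, the sketch below shows the kind of Rekognition call involved: searching one photo against a pre-indexed face collection. This is not the ACLU’s actual test harness; it assumes boto3 is installed, AWS credentials are configured, and the collection name and photo path are hypothetical. The key knob is the similarity threshold, since Rekognition’s default of 80 percent returns far more (and weaker) matches than the 99 percent Amazon has said law enforcement should use.

```python
# Illustrative sketch only -- not the ACLU's test code or Amazon's recommended setup.
# Assumes boto3 is installed, AWS credentials are configured, and a face collection
# named "mugshot-collection" (hypothetical) has already been indexed.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def search_face(photo_path: str, threshold: float = 80.0):
    """Search an indexed collection for faces similar to the one in photo_path.

    FaceMatchThreshold controls how loose a "match" can be: the service default
    of 80% admits many more candidate matches than a 99% threshold.
    """
    with open(photo_path, "rb") as f:
        image_bytes = f.read()

    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",   # hypothetical collection name
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,
        MaxFaces=5,
    )
    # Return the external ID (if one was set at indexing time) and similarity score
    # for each face the service considers a match.
    return [
        (match["Face"].get("ExternalImageId"), match["Similarity"])
        for match in response["FaceMatches"]
    ]

# The same photo can produce "matches" at the 80% default that vanish at 99%:
# print(search_face("member_photo.jpg", threshold=80.0))
# print(search_face("member_photo.jpg", threshold=99.0))
```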

Then, on June 11, Microsoft announced a moratorium on the sale of its facial recognition technology to law enforcement until there is a federal law regulating its use. Google had previously declined to offer general-purpose facial recognition technology, and its employees have opposed police use of such tools.

Privacy advocates generally applauded the companies’ decisions to back away from selling facial recognition technology to police, though some said the moves were long overdue.

“It took two years for Amazon to get to this point, but we’re glad the company is finally recognizing the dangers face recognition poses to black and brown communities and civil rights more broadly,” said Nicole Ozer, technology and civil liberties director with the ACLU of Northern California.

But a one-year moratorium isn’t enough, she added. “Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same.”

Isedua Oribhabor, U.S. policy analyst at Access Now, called Amazon’s moratorium a “small step in the right direction” but said the company needs to do more.

“At a minimum, Amazon must create a cross-functional human rights team and perform human rights impact assessments to truly evaluate the effects that its tech products have on civil and human rights, particularly on the rights of the black and brown communities that have suffered the most from use of its technology,” she said.

Others criticized the companies’ decisions. While misidentification of some racial groups is a real problem, clear rules governing how facial recognition may be used would be a better solution than suspending it, said Stephen Hyduchak, CEO of Aver, an identity verification service.

“This technology already exists, so you cannot go back in time and just wish it out of existence,” he said. “The concerns are legitimate, but the technology is in its infancy.” Rather than bans, he said, tech companies should rethink their privacy rules and be more transparent with customers about how the technology is used.

But David Reischer, an attorney and CEO of LegalAdvice.com, said the moratoriums are appropriate. “There needs to be a proper balance between helping the police and guarding against invasion of privacy,” he said.

The companies’ decisions could hurt police investigations in the short term, he said, but “this moratorium is a proper opportunity for all stakeholders to come together and address the serious policy considerations associated with the flaws in the technology.”
