Experts remain divided on facial recognition technology despite another wrongful arrest
The City of Detroit is being sued for falsely arresting a 32-year-old mother of three in February on carjacking charges after police used facial recognition technology in their investigation.
Porcha Woodruff is the sixth person known to have been wrongly accused and arrested in connection with a crime after police used the technology.
Woodruff is suing the city and Det. LaShauntia Oliver, in both professional and individual capacities, on counts of false arrest and imprisonment, malicious prosecution, denial of public accommodations and public services, and intentional infliction of emotional distress.
The suit did not cite a specific monetary amount being sought by Woodruff, who was eight months pregnant at the time of her arrest. She was released from jail after posting a $100,000 bond; the ordeal caused her to have stress-related contractions.
In a written statement, Detroit Police Department Chief James E. White said, “I have reviewed the allegations contained in the lawsuit. They are very concerning. We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation.
“We will provide further information once additional facts are obtained and we have a better understanding of the circumstances.”
It’s not the first time the police department has come under fire for such an arrest since it began using facial recognition technology in 2019.
In 2020, Detroit authorities wrongfully arrested Robert Williams for allegedly stealing thousands of dollars' worth of watches. A year earlier, authorities falsely arrested then-26-year-old Michael Oliver for allegedly reaching into a teacher's car, grabbing a cellphone and throwing it, damaging the device.
All of the plaintiffs in these cases have one thing in common: They're all Black. Researchers and critics have long argued that facial recognition technology is disproportionately likely to harm people from vulnerable communities.
In 2022, researchers at Georgia State University found that police departments that used it disproportionately arrested Black people.
Research conducted by scholars from the Massachusetts Institute of Technology and Microsoft in 2018 concluded that some algorithms wrongly identified Black women almost 35 percent of the time but correctly identified white men nearly 100 percent of the time.
The technology's unreliability has led cities across the country to ban law enforcement from using it altogether. In 2019, San Francisco's Board of Supervisors voted 8-to-1 to prohibit its police department from doing so.
Lawmakers in about a dozen communities around the nation have made similar decisions.
Among advocates, there is concern about the lack of oversight of police departments once they acquire the technology.
"I think the case we hear made for the ban is that police can't be trusted to use their technology in a way that won't harm people's civil rights and civil liberties," said Katie Kinsey, chief of staff of the Policing Project at New York University School of Law.
The organization works to promote public safety through transparency, equity and democratic engagement.
On the legal side, there has been debate about whether the use of such technology in police investigations could violate the Fourth Amendment, which protects people in the U.S. against unreasonable searches and seizures.
The courts have yet to take up that concern, however.
Nathan Freed Wessler, deputy director of the ACLU's Speech, Privacy and Technology Project, said his organization favors banning police departments from using facial recognition technology.
Even though departments are being sued for using it, there continues to be interest in its adoption. In 2018, authorities in Annapolis, Md., used facial recognition to identify the person responsible for the shootings at The Capital Gazette newsroom.
“At the very least, we need lawmakers in Congress and state legislatures to sit down and take a very serious look at this technology and how it’s being used and figure out what the guardrails are,” Wessler said.
Still, advocates differ on how to address these concerns. Kinsey said her organization believes communities should be able to make their own choices about how law enforcement agencies implement the technology.
If a community wants to implement the technology, then it should be regulated, she argued.
“Folks feel like policing is often not keeping those communities safe,” Kinsey added, speaking generally about the issue. “And so why give a more powerful tool to this agency when we haven’t solved some of the harm that it’s caused already?”