Prosecutors dropped the case less than two weeks later, arguing that officers had relied on insufficient evidence. Police chief James Craig later apologized for what he called “shoddy” investigative work. Williams, who said he had been driving home from work when the 2018 theft occurred, was interrogated by detectives and held in custody for 30 hours before his release.
Williams’ case sparked a public outcry about the fast-growing police use of a technology that research has shown often misidentifies people of color. His lawsuit is at least the third in the U.S. brought by Black men to raise doubts about the software’s accuracy.
The case could heighten the legal challenges for a technology that is largely unregulated in the U.S., even as it has become a prolific investigative tool used by police forces and federal investigators. While the software has been banned by more than a dozen cities nationwide, it has been cited in a growing number of criminal cases, including in the landmark investigation of rioters at the Capitol on Jan. 6.
Williams’ attorneys did not make him available for comment Tuesday. But Williams wrote in The Washington Post last year that the episode had left him deeply shaken, in part because his young daughters had watched him get handcuffed in his driveway and put into a police car after returning home from work.
“How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway?” he wrote. “As any other black man would be, I had to consider what could happen if I asked too many questions or displayed my anger openly — even though I knew I had done nothing wrong.”
Detroit police spokeswoman Sgt. Nicole Kirkwood said the department does not comment on pending litigation. But she pointed to comments from Craig last year in which the police chief said the case’s failures “had nothing to do with technology, but certainly had everything to do with poor investigative work.”
“This was clearly sloppy, sloppy investigative work. There’s no other way for me to say it but that way,” Craig told a Detroit Board of Police Commissioners meeting last June that is cited in the lawsuit. “If you just rely solely on facial recognition technology, there’s a high probability that it’s going to misidentify.”
Kirkwood told The Post in a statement last year that the department does not make arrests based solely on a facial recognition search and that, in Williams’ case, investigators had reviewed video, interviewed witnesses, conducted a photo lineup and submitted evidence to prosecutors, who recommended charges against Williams for first-degree retail fraud.
Wayne County prosecutors later said the facial recognition result was not enough evidence to bring charges and that the store security official shown the photo lineup hadn’t been in the store during the crime.
“This case should not have been issued based on the DPD investigation, and for that we apologize,” prosecutor Kym L. Worthy said in a statement. “Thankfully, it was dismissed on our office’s own motion. This does not in any way make up for the hours that Mr. Williams spent in jail.”
Williams’ identification as the thief happened after Detroit detectives sent a blurry, dimly lit image from a surveillance camera to the Michigan State Police, which ran a facial recognition search that pointed to Williams’ old driver’s license photo as a possible match.
But the state police’s “investigative lead report” also said, in all capital letters, that the document was not a positive identification or sufficient probable cause for an arrest. The detective nevertheless submitted the photo to prosecutors as evidence to support an arrest warrant.
The civil suit argues that Williams’ rights were violated under the Fourth Amendment, which bans “unreasonable” police searches, as well as a state civil rights law prohibiting racial discrimination. The lawsuit seeks an unspecified amount in damages as well as policy changes from the Detroit Police Department, which continues to use the software.
Williams is being represented by student attorneys at the University of Michigan Law School’s Civil Rights Litigation Initiative as well as lawyers from the American Civil Liberties Union and the advocacy group’s Michigan affiliate.
One of the student attorneys, Jeremy Shur, said Tuesday that Williams’ daughters, ages 3 and 7, have been “traumatized” by the incident. “When they see police, they wonder if they’re taking Daddy away,” Shur said.
The software’s accuracy depends heavily on image quality: Blurry, grainy or dark photos often produce poor results. But accuracy also varies widely among the algorithms themselves: Several of those tested in a 2019 federal study were up to 100 times more likely to misidentify the face of an African American or Asian person than that of a White person.
Williams’ lawsuit is the second accusing Detroit police of making a false facial recognition match: In September, a 26-year-old man named Michael Oliver sued the department, saying his false arrest on a 2019 larceny charge led him to lose his job and spend three days in jail.
The same detective, Donald Bussa, investigated both Oliver and Williams and is named in both lawsuits. Craig, the Detroit police chief, has criticized Bussa’s use of a “blurry” photo and said the department has worked to change the facial recognition policies that led to the arrest.
In a third lawsuit, filed in January, a man named Nijer Parks sued New Jersey police and prosecutors, saying he was held in jail for 10 days after he was falsely accused of stealing from a hotel gift shop in 2019. All three cases are ongoing.
Defenders of the technology said it should be used solely to generate leads for police, not as the lone piece of evidence, and that officers should not rely too heavily on its results or apply it to every low-level crime. The Detroit department’s policy has since been changed to allow the use of facial recognition software only in cases of violent crime.
But critics argue that officers who put too much trust in the systems’ findings, or who alter the search images in hopes of better results, as researchers have documented in some police departments, could end up shifting the burden of proof onto innocent people, who may never be told what investigative techniques were used as the basis for their arrest.
Both the Detroit and Michigan state police have a contract with a South Carolina-based company, DataWorks Plus, that makes facial recognition software. The company did not immediately respond to requests for comment.
The Detroit department is also among hundreds of police agencies that have used Clearview AI, a facial recognition tool that searches through a large database of photos taken from across the Internet, according to a BuzzFeed News report earlier this month based on data from a confidential source. Neither the Detroit police nor Clearview has confirmed the report, and it does not appear Clearview was used in Williams’ case.
Clare Garvie, a senior associate at Georgetown Law’s Center on Privacy and Technology who has studied facial recognition software and is not involved in the lawsuit, said the lawsuits have helped shed light on investigative and technological breakdowns that would otherwise remain unseen.
But she expressed concern that slow court proceedings and a patchwork approach to regulation could lead to more cases of facial recognition misidentification before the existing damage could be addressed.
She also worried that the cases shifted the costs of police failures onto the people who had been falsely identified, and onto the general public, which both lives in fear of false arrests and ends up paying to defend or settle the cases in court.
“There’s the burden of somebody after the fact, who’s already been injured by a misidentification, to inform the public of what happened to them,” she said. Then “the taxpayer bears the burden of the mistake.”