When Facial Recognition Is Used to Identify Defendants, They Have a Right to Obtain Information About the Algorithms Used on Them, EFF Tells Court

We urged the Florida Supreme Court yesterday to review a closely watched lawsuit to clarify the due process rights of defendants identified by facial recognition algorithms used by law enforcement.

Specifically, we told the court that when facial recognition is secretly used on people later charged with a crime, those people have a right to obtain information about how the error-prone technology functions and whether it produced other matches.

EFF, the ACLU, Georgetown Law’s Center on Privacy & Technology, and the Innocence Project filed an amicus brief in support of the defendant’s petition for review in Willie Allen Lynch v. State of Florida. Prosecutors in the case didn’t disclose how the algorithm worked, that it produced other matches that were never considered, or why Lynch’s photo was selected as the best match. This information qualifies as “Brady” material—evidence that might exonerate the defendant—and should have been turned over to Lynch.

We have written extensively about how facial recognition systems are prone to error and produce false positives, especially when the algorithms are used on African Americans, like the defendant in this case. Researchers at the FBI, MIT, and ProPublica have reported that facial recognition algorithms misidentify black people, young people, and women at higher rates than white people, the elderly, and men.

Facial recognition is increasingly being used by law enforcement agencies around the country to identify suspects. It’s unfathomable that technology that could help put someone in prison is mostly used without question or oversight. In Lynch’s case, facial recognition could help send him to prison for eight years.

Undercover police photographed Lynch using an older-model cell phone at an oblique angle while he was in motion. The photo, which is blurred in places, was run through a facial recognition algorithm to see whether it matched any images in a database of county booking photos. The program returned a list of four possible matches, the first of which was Lynch’s photo from a previous arrest. His photo was the only one sent on to prosecutors, along with his criminal records.

The algorithm used on Lynch is part of the Face Analysis Comparison Examination Systems (FACES), a program operated by the Pinellas County Sheriff’s Office and made available to law enforcement agencies throughout the state. The system can search over 33 million faces from drivers’ licenses and police photos. It doesn’t produce a “yes” or “no” answer; instead, it rates candidate photos as more or less likely matches. Error rates in systems like this can be significant, and the poor condition of Lynch’s photo only exacerbates the possibility of error.
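The sheriff’s office said it “cannot speak to the algorithms” behind FACES, so its internals are unknown. But systems of this kind typically compare a numeric “embedding” of the probe photo against a gallery and return a ranked candidate list, never a binary answer. The sketch below is purely illustrative (the gallery names, vectors, and scoring function are hypothetical assumptions, not FACES’ actual method); it shows why such a system always produces *some* list of “matches,” even when the true person isn’t in the database at all:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(probe, gallery, top_n=4):
    """Return the top_n gallery entries ranked by similarity to the probe.

    Note: every probe yields a ranked list of "possible matches" --
    the system never answers "no match found".
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_n]

# Hypothetical 3-dimensional embeddings; real systems use much higher dimensions.
gallery = {
    "booking_photo_A": [0.9, 0.1, 0.3],
    "booking_photo_B": [0.4, 0.8, 0.1],
    "booking_photo_C": [0.2, 0.2, 0.9],
    "booking_photo_D": [0.7, 0.6, 0.2],
}
probe = [0.8, 0.2, 0.35]  # embedding of the blurry surveillance photo

for name, score in rank_candidates(probe, gallery):
    print(f"{name}: {score:.2f}")
```

Because the output is a ranked list of scores, the candidates below the top hit (here, photos B through D) are exactly the kind of alternative matches Lynch argued he was entitled to see.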

FACES is poorly regulated and shrouded in secrecy. The sheriff said that his office doesn’t audit the system, and there’s no written policy governing its use. The sheriff’s office said it hadn’t been able to validate the system, and “cannot speak to the algorithms and the process by which a match is made.”

Lynch didn’t learn that he had been identified by a facial recognition algorithm until just days before his final pretrial hearing, although prosecutors had known for months. Prior to that, prosecutors had never disclosed information about the algorithm to Lynch, including that it produced other possible matches. Neither the crime analyst who operated the system nor the detective who accepted the analyst’s conclusion that Lynch’s face was a match knew how the algorithm functioned. The analyst said the first-listed photo in the search results is not necessarily the best match—it could be one further down the list. An Assistant State Attorney doubted the system was reliable enough to meet the standards courts use to assess the credibility of scientific testimony and whether it should be admitted at trial. Lynch asked for the other matches produced by FACES—the court refused.

If a human witness who identified Lynch in a line-up said others in the line-up also looked like the criminal, the state would have had to disclose that information, and Lynch could have investigated those alternate leads. The same principle should have required the state to disclose other people the algorithm produced as matches and information about how the algorithm functions, EFF and ACLU told the Florida Supreme Court.

When defendants are facing lengthy prison sentences or even the death penalty, tight controls on the use of facial recognition are crucial. Defendants have a due process right to information about the algorithms used and their search results. The Florida Supreme Court should accept this case for review and provide guidance to law enforcement agencies that use facial recognition to arrest, charge, and deprive people of their liberty.


Tuesday 12th March 2019 4:22 pm
