Facial recognition software used by many law enforcement agencies is prone to racial bias, according to a new report released by the Center on Privacy and Technology at Georgetown Law. 

  • Study: Face recognition system prone to bias
  • Report conducted by Georgetown Law
  • Pinellas Sheriff Bob Gualtieri defends use of system

The Pinellas County Sheriff's Office is among the largest law enforcement agencies in the country that use the software. According to the study, flaws in the facial recognition software make the database a recipe for false arrests. 

The Pinellas Sheriff's Office uses a system called FACES: Face Analysis Comparison Examination System. The system's database holds more than 30 million images shared by more than 30 partner agencies. 

The program allows authorities to quickly check possible suspects' identities without having to sift through stacks of photos, a process that could take hours or even days. 

But, according to the study, the software is racially biased: it relies on mugshots for input data, and because Black people are arrested at disproportionate rates, they are overrepresented in the system. 

The report says mugshot databases are rarely purged of people who are found innocent. That means their images stay in the facial recognition database to be searched over and over again. 

The report also indicated that the software does a poor job of distinguishing darker-skinned faces, which could lead to false matches. 

Despite promises from agencies nationwide that the system would not be abused, the report found that, aside from the Michigan State Police, no law enforcement agency has released reports on how the system is audited for abuse by officers. 

In Pinellas County, the system is used as many as 5,000 times a month. 

Sheriff Bob Gualtieri, who has defended use of the system, said his agency has not had any issues with the software. Gualtieri is expected to answer questions about the system on Wednesday.