As municipalities around the country roll back their use of facial recognition technology, one California city has banned the use of predictive policing.
Santa Cruz was one of the first cities to pilot a predictive policing solution from PredPol, a local company that pioneered the technology. It uses algorithms to link key aspects of offender behavior, such as repeat victimization and location, to predict how crime patterns will evolve.
Machine learning applied to large datasets enables PredPol to predict “where and when crimes are most likely to occur, using just three data points: type of crime, location of crime, and date/time of crime,” the company said on its blog. PredPol also uses GPS and vehicle location tracking data where available, but it does not use personally identifiable information or demographic information, an approach the company says provides greater transparency and avoids profiling and privacy concerns.
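To make the description concrete, here is a minimal sketch of how a hotspot predictor could work from only those three data points. This is an illustration of the general idea, not PredPol's actual model; the grid size, decay half-life, and sample records are all invented for the example.

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical sample data -- each record holds only the three fields
# the company describes: crime type, location, and date/time.
crimes = [
    ("burglary", (36.974, -122.030), datetime(2011, 6, 1, 14, 0)),
    ("burglary", (36.975, -122.031), datetime(2011, 6, 20, 2, 30)),
    ("theft",    (36.960, -122.010), datetime(2011, 5, 5, 18, 15)),
]

def hotspot_scores(crimes, now, cell=0.005, half_life_days=30.0):
    """Rank map grid cells by a recency-weighted count of past crimes."""
    scores = defaultdict(float)
    for _ctype, (lat, lon), when in crimes:
        key = (round(lat / cell), round(lon / cell))       # snap to a grid cell
        age_days = (now - when).total_seconds() / 86400.0
        scores[key] += 0.5 ** (age_days / half_life_days)  # older crimes count less
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# The top-ranked cell is where a patrol would be directed.
ranked = hotspot_scores(crimes, now=datetime(2011, 7, 1))
top_cell, top_score = ranked[0]
```

Note that a scheme like this illustrates the critics' point in L5-style objections: the cells that score highest are exactly the ones where police already recorded crimes, so patrols are steered back to the same neighborhoods.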
Critics say the technology perpetuates police bias by sending patrols back to areas where they’ve already made arrests.
In 2011, Santa Cruz began testing the technology with historical crime data, and the algorithm predicted about one-third of the crimes within a given location. In 2017, officials put a moratorium on the use of predictive policing technology. Now, it looks like it will be the first city to officially ban the use of both predictive policing and facial recognition technology because they can be “disproportionately biased against people of color,” as Mayor Justin Cummings said in a Reuters report. A final vote by the city council will take place Aug. 11.
“Predictive policing could have been effective if it had been used to work with community members to solve problems — that didn’t happen,” Police Chief Andy Mills told the LA Times. Instead, he said, the policy was used purely for enforcement, which led to unavoidable conflicts.
The Santa Cruz Sentinel reported that the pending ban prohibits use of both technologies, unless police get explicit approval from the City Council via a resolution that is based on “findings that the technology and the data that informs the technology meets scientifically validated and peer reviewed research, protects and safe guards the civil rights and liberties of all people, and will not perpetuate bias.”
PredPol said it supported the city resolution’s requirement that predictive policing not perpetuate racial bias.
“Any government agency that applies technology to its operations should have a process to ensure that it does not result in racially inequitable outcomes,” PredPol CEO Brian MacDonald wrote in an email to the Los Angeles Times. Because the company is confident its software is not racially biased, he said, it meets the conditions in the city’s ordinance.