
Predictive policing shows promise in Chicago

As police departments across the country experiment with predictive policing technologies, at least one city is seeing some progress.

In Chicago, where the city’s overall murder rate has risen 3 percent, the districts that have implemented the predictive technology have seen a drop in the number of shootings and homicides, according to Reuters.

Three districts saw 15 percent to 29 percent fewer shootings and 9 percent to 18 percent fewer homicides, according to Reuters’ analysis of department data. The 7th District saw a 39 percent drop in shootings in the first seven months of 2017 compared with the same period in 2016.

One of the tools Chicago police are using is HunchLab, predictive policing software based on risk terrain modeling, according to The Verge. Its model takes many variables into account, including crime statistics, bar locations, weather conditions and even lunar phases. These risk factors are mapped onto a grid of the city so police can decide where to place their resources.
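The Verge’s description suggests a simple structure: divide the city into grid cells, score each cell from several weighted risk layers, then rank the cells. The sketch below illustrates that idea in Python; it is not HunchLab’s actual code, and every layer name, weight and grid cell in it is hypothetical.

```python
# Illustrative sketch of risk terrain modeling (not HunchLab's implementation).
# The city is divided into grid cells; each risk layer contributes a weighted
# score to every cell, and cells are ranked by total risk.

from collections import defaultdict

# Hypothetical risk layers: cell -> layer value (e.g., recent shootings per
# cell, number of bars per cell, a weather/seasonality factor).
risk_layers = {
    "recent_shootings": {(12, 7): 3, (12, 8): 1},
    "bar_locations":    {(12, 7): 2, (5, 3): 4},
    "weather_factor":   {(12, 7): 0.8, (5, 3): 0.8},
}

# Hypothetical weights assigned to each layer.
layer_weights = {"recent_shootings": 2.0, "bar_locations": 0.5, "weather_factor": 1.0}

def score_cells(layers, weights):
    """Combine weighted risk layers into a single score per grid cell."""
    scores = defaultdict(float)
    for name, cells in layers.items():
        for cell, value in cells.items():
            scores[cell] += weights[name] * value
    return scores

# Rank cells so patrols can be directed to the highest-risk areas first.
ranked = sorted(score_cells(risk_layers, layer_weights).items(),
                key=lambda kv: kv[1], reverse=True)
for cell, risk in ranked:
    print(cell, round(risk, 2))
```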

The use of these predictive policing tools has raised concerns among some activists. Organizations such as the American Civil Liberties Union say predictive policing tools will only perpetuate existing bias in policing: if specific communities have been more heavily policed in the past, then relying on that historical data could flag those communities as higher risk.

The St. Louis County Police Department took these concerns into account when implementing HunchLab, configuring the software to surface predictions of serious felonies rather than low-level crimes like drug possession, The Verge reported.

To help project managers and developers mitigate the unintended bias in the algorithms behind predictive programs, the Center for Democracy & Technology has released a digital decisions tool.

The digital decisions tool “translates principles for fair and ethical automated decision-making into a series of questions that can be addressed during the process of designing and deploying an algorithm,” CDT Policy Analyst Natasha Duarte explained in a blog post.

The questions address what data developers use to train an algorithm, the factors or features in the data they should consider, how to test the algorithm and how to ensure fairness.
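As one illustration of the kind of fairness check those questions point toward, the sketch below compares how often a hypothetical model flags different groups as high risk. The group names, data and flags are all assumptions for illustration, not part of the CDT tool.

```python
# Illustrative sketch of a basic fairness check: compare how often a model
# flags records from different groups as "high risk" (all data hypothetical).

def flag_rate(predictions, group):
    """Share of records in `group` that the model flagged as high risk."""
    rows = [p for p in predictions if p["group"] == group]
    return sum(p["flagged"] for p in rows) / len(rows)

# Hypothetical model output: each record carries a group label and a flag.
predictions = [
    {"group": "district_a", "flagged": 1},
    {"group": "district_a", "flagged": 0},
    {"group": "district_b", "flagged": 1},
    {"group": "district_b", "flagged": 1},
]

rate_a = flag_rate(predictions, "district_a")
rate_b = flag_rate(predictions, "district_b")

# A large gap between groups is a signal to revisit the training data and features.
print(f"district_a: {rate_a:.2f}, district_b: {rate_b:.2f}, gap: {abs(rate_a - rate_b):.2f}")
```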

About the Author

Matt Leonard is a reporter/producer at GCN.

Before joining GCN, Leonard worked as a local reporter for The Smithfield Times in southeastern Virginia. In his time there he wrote about town council meetings, local crime and what to do if a beaver dam floods your back yard. Over the last few years, he has spent time at The Commonwealth Times, The Denver Post and WTVR-CBS 6. He is a graduate of Virginia Commonwealth University, where he received the faculty award for print and online journalism.

Leonard can be contacted at mleonard@gcn.com or followed on Twitter @Matt_Lnrd.




Reader Comments

Thu, Aug 10, 2017

Pick a city, any city, and ask a crime reporter, police officer, or citizen to predict where crime will occur this Saturday. I bet their prediction will match predictive policing algorithms. All things being equal, the area that had the worst crime rate last Saturday and the Saturday before that, and the Saturday before that, etc., will continue to have the worst crime rate. Rocket science anyone?
