
The push to make local justice data shareable

State and federal agencies have been prioritizing access to criminal justice data at the local -- especially county -- level in an effort to improve their understanding of issues and make data-driven decisions, said Debbie Allen, the chief justice planning officer for the Criminal Justice Coordinating Council of Adams County, Colo.

That pressure has forced counties to think about how to consolidate and integrate their data with partners at different agencies and levels of government, she said at the National Association of Counties Legislative Conference on March 6.

An important first step in integrating data from multiple, distinct sources -- like a jail management system and local court data -- is governance, which means getting people on board, finding shared priorities and deciding on research questions, Allen said. In Adams County, she said, those questions included, “How do we reduce the overutilization of jail?” and “How do we ensure the proper resources for the incarcerated population are available in the community?”

Sarah Davis, a data analyst at Pima County, Ariz., Behavioral Health, said starting small can help get buy-in to the process. In Pima County, this meant automating a process to collect medical and behavioral treatment history from an individual's electronic health records when that person was booked into jail, Davis said.
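Pima County's actual system isn't described in detail, but the kind of booking-time automation Davis mentions could, in rough sketch, look like the following. All function names, fields and data shapes here are hypothetical, purely for illustration:

```python
# Hypothetical sketch: when a booking event occurs, automatically pull
# medical and behavioral health entries from an EHR store and attach
# them to the booking record. All names and fields are invented.

def fetch_treatment_history(ehr_records, person_id):
    """Return medical and behavioral health entries for one individual."""
    return [
        r for r in ehr_records
        if r["person_id"] == person_id
        and r["category"] in ("medical", "behavioral")
    ]

def on_booking(booking, ehr_records):
    """Enrich a booking record with the person's treatment history."""
    history = fetch_treatment_history(ehr_records, booking["person_id"])
    return {**booking, "treatment_history": history}
```

In a real deployment the EHR lookup would go through the health system's own interface and be subject to privacy and consent rules; the point of the sketch is only the shape of the automation: a booking event triggers the lookup, so no one has to request records manually.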

“Success begets success,” she added.

But once the priorities are laid out, the research questions are in place and “everyone is singing kumbaya,” then begins what can be a months-long process of actually bringing the data together, according to Andrew Owen, the executive director of the Open Justice Broker Consortium (OJBC), a collaboration of governments dedicated to improving justice information sharing through the reuse of low-cost, standards-based integration software.

The process starts with going to the specific agencies that want to share data and assessing their technical architecture.

“Normally one of the harder parts of the projects is to work with the agencies to say, ‘This is the data that we need, tell us what your system supports,’” Owen said in an interview.

Owen and his team send agencies a questionnaire asking how their data is stored, what kind of database is being used and whether it can be queried directly or through application programming interfaces.
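The questionnaire itself isn't published with the article, but the assessment it captures might be modeled something like this. The class, its fields and the effort tiers are all assumptions for illustration, not OJBC's actual instrument:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataAccessAssessment:
    """One agency's answers to a (hypothetical) technical questionnaire."""
    agency: str
    storage: str                 # e.g. "relational database", "flat files"
    database: Optional[str]      # e.g. "PostgreSQL", or None if unknown
    direct_query: bool           # can the database be queried directly?
    has_api: bool                # does the system expose an API?

    def integration_effort(self) -> str:
        """Rough triage: existing APIs are the cheapest path in."""
        if self.has_api:
            return "low: reuse existing interfaces"
        if self.direct_query:
            return "medium: query the database directly"
        return "high: build a custom extract"
```

This mirrors the triage Owen describes: systems with existing APIs are the easy cases, and everything else requires estimating the extra effort before the project can be scoped.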

“A lot of times they might have existing interfaces, existing APIs you can work with,” he said. “If they don’t, then we have to understand the effort that’s required" to access the data and make it shareable.

OJBC tries to conform to common data standards for information exchange, but some legacy systems won’t support that. In that case, Owen said, OJBC writes “what we call connectors or adaptors” to make legacy data consistent with the desired standards.
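OJBC's actual connector code isn't shown in the article, but at its simplest, an adapter of the kind Owen describes translates a legacy system's field names and conventions into the shared standard. The mapping below is invented for illustration; real justice data standards involve far richer schemas than a field rename:

```python
# Hypothetical "connector": translate a legacy jail record into a
# common target schema. Field names on both sides are invented.

LEGACY_TO_STANDARD = {
    "inmate_no": "person_id",
    "book_dt": "booking_date",
    "chg": "charge_description",
}

def adapt_legacy_record(legacy: dict) -> dict:
    """Rename legacy fields to the shared standard, keeping values intact.

    Fields with no mapping pass through unchanged, so nothing is lost.
    """
    return {LEGACY_TO_STANDARD.get(k, k): v for k, v in legacy.items()}
```

The value of the connector pattern is that the legacy system never has to change: the adapter sits between it and its partners, so each agency keeps its own system while the shared data stays consistent.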

That part of a data integration project can take up to a year, depending on the specifics of the effort, he said.

OJBC also provides open source tools for data analysis and visualization as part of a toolkit. Some of the resources are pulled from other organizations, like Apache, but OJBC also does some of its own software development. It publishes its tools on GitHub.

Although risk-averse agencies may have avoided open source solutions in the past, officials must start getting more comfortable with the idea of open source, Owen said.

“[Open source] is becoming more and more prevalent in the government space and it's just, to me, imperative,” he said.

About the Author

Matt Leonard is a former reporter for GCN.
