Navigating continuous change with AI
- By Cal Zemelman
- Dec 16, 2020
In today's IT world, change is constant. While some agencies might be sidelined by change, others will thrive. The key is to understand how to embrace and navigate continuous change through greater visibility and predictability.
One of the most efficient ways to do this is to understand and leverage data already in-house -- data that has been collected through existing processes but has yet to be analyzed. By leveraging the latest explainable artificial intelligence and big data tools, which can easily scan petabytes of structured and unstructured data, agency IT teams can rapidly gain critical data-driven insights that support better business and operational decisions.
A government agency evaluating grant applications, for example, must research, issue and track each award through to its strategic end goal. While this may seem like a relatively basic process, it presents a golden opportunity for agencies to study the data they collect to make far more informed decisions.
For example, where are the applying organizations based? What is the purpose of the grant? Have the organizations applied for grants in the past, and were they successful? Many agencies have all this information in-house, but they are not formally analyzing it across the enterprise due to data silos and a lack of analytical support. By introducing AI as an automated exploration tool, one agency was able to dive much deeper into the data and make in-depth discoveries -- informing decisions based on factors that hadn't previously been considered.
For example, the agency learned that:
- There was a difference in project success based on when the project started. Those that started in the summer were more successful than those launched in winter.
- Smaller grants proved more successful than larger grants.
- The grantee organization size was important as well, as the agency discovered that a larger, more mature organization was more consistently effective than a smaller one.
Analyzing data that already existed helped this agency better predict the success or failure of certain grant programs and determine how and where to invest in the future for the greatest impact.
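The kind of automated exploration described above can be sketched as a simple success-rate breakdown by factor. This is a minimal illustration, not the agency's actual tooling; the records and field names here are hypothetical:

```python
from collections import defaultdict

# Hypothetical grant records; the fields mirror the factors the agency examined.
grants = [
    {"season": "summer", "amount": 50_000,  "org_size": "large", "success": True},
    {"season": "summer", "amount": 40_000,  "org_size": "small", "success": True},
    {"season": "winter", "amount": 250_000, "org_size": "small", "success": False},
    {"season": "winter", "amount": 60_000,  "org_size": "large", "success": True},
    {"season": "summer", "amount": 300_000, "org_size": "large", "success": False},
    {"season": "winter", "amount": 275_000, "org_size": "small", "success": False},
]

def success_rate_by(records, key):
    """Group records by one factor and compute the success rate per group."""
    totals, wins = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        wins[r[key]] += r["success"]
    return {k: wins[k] / totals[k] for k in totals}

print(success_rate_by(grants, "season"))
# On this toy data, summer-launched grants succeed at a higher rate than winter ones.
```

Running the same breakdown across every collected field surfaces exactly the kind of unexpected factors -- launch season, grant size, organization size -- that a human analyst might never think to test.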
By using data already in-house and then allowing AI to make predictions, agencies can more effectively forecast a future state. This goes a long way toward helping them become more efficient and nimble in a landscape of continuous change.
Here’s another example: bridge, dam and building inspections. Agencies are using drones or handheld cameras to capture pictures of structures, which are then sent back to the office for review. Humans can analyze that data, but they can only go so far and scale so much. AI technologies can analyze the collected imagery, tie it back to historical information and baselines, and flag potential safety issues -- even identifying hotspots before they become problems.
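The baseline-comparison step can be sketched very simply. Assume a hypothetical upstream image-analysis stage has already scored each region of a structure for visible defects (the region names, scores and threshold here are illustrative, not from any real inspection system):

```python
# Hypothetical per-region defect scores (0.0-1.0) from image analysis,
# alongside the historical baseline for the same structure.
baseline = {"deck": 0.02, "pylon": 0.01, "cable": 0.03}
current  = {"deck": 0.04, "pylon": 0.09, "cable": 0.03}

def flag_hotspots(baseline, current, factor=3.0):
    """Flag regions whose current defect score exceeds the baseline by a factor."""
    return [region for region, score in current.items()
            if score > baseline[region] * factor]

print(flag_hotspots(baseline, current))  # only the region that jumped well past baseline
```

The value is in the comparison against history: a region that looks acceptable in isolation can still be flagged because it has degraded sharply relative to its own baseline.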
How can agencies best accomplish this? Two words: Open source. Specifically, they should consider a platform-as-a-service type of solution that runs open source AI technologies so projects are quick to launch but the agency isn’t locked into one solution in the long term. The best options will have features like automated machine learning to improve productivity and the ability to automatically generate graphical explanations of the results of analysis.
I like to think of it as putting on an Iron Man suit. Agencies take data they already have, then use modern analytical tools to look at things from 100 different angles -- angles humans might never have considered -- and, voila! Combine human judgment with data, and you’ve got a superhero.
Cal Zemelman is the executive director of CVP’s data science and engineering practice.