5 steps to accelerating insights with predictive analytics
- By Steve Orrin
- Mar 29, 2019
Analytics come in two basic forms. One helps federal IT professionals learn from the past, while the other helps them lay groundwork for the future.
The former, operational analytics, provides a wide spectrum of historical insights on what happened, which can be helpful in understanding why some things worked and others did not. Many agencies use operational analytics to find out why a piece of equipment went down, why a network slowdown occurred or what was behind a network intrusion.
But the latter, predictive analytics (or advanced analytics), is arguably more effective in giving IT professionals the insight they need to meet their agencies’ objectives. For example, predictive analytics can help them understand what it will take to modernize network, storage and compute infrastructures to better handle data movement and workload processing. Optimizing predictive analytics can lead to better decision-making, which can impact everything from troops in theater to disaster response.
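The operational/predictive distinction can be sketched in a few lines. The monthly equipment-failure counts and the simple linear-trend forecast below are hypothetical, for illustration only:

```python
# Hypothetical monthly equipment-failure counts (operational data).
failures = [3, 4, 4, 6, 7, 9]

# Operational analytics: describe what already happened.
total = sum(failures)                            # failures to date
worst_month = failures.index(max(failures)) + 1  # month with the most failures

# Predictive analytics: fit a simple linear trend and forecast next month.
n = len(failures)
xs = range(1, n + 1)
x_mean = sum(xs) / n
y_mean = total / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, failures)) \
        / sum((x - x_mean) ** 2 for x in xs)
forecast = y_mean + slope * (n + 1 - x_mean)     # expected failures next month
```

A real program would use richer models and far more data, but the shift is the same: from summarizing the past (`total`, `worst_month`) to anticipating the future (`forecast`).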
Here are five strategies federal IT professionals can adopt to make the leap from a purely operational approach to a cohesive predictive analytics solution.
1. Identify the objectives
To begin the transformation, agency administrators should ask tough, high-level questions about the problems they want to solve. How can we deploy more troops faster? How can we get resources to disaster victims sooner? How can we prevent fraud? How can we better predict weather patterns that may require government response? The answers to these questions help determine what kind of insights teams are seeking and which technologies to use.
It’s important to have a cross-section of people asking these questions. Diversity of thought plays a role here, and leaders should rely on mission/line of business owners, data scientists and executives alike. The team must fully comprehend the organization’s most pressing mission objectives and how data science can be leveraged to predict and solve those challenges.
2. Recognize cultural and technical challenges
When it comes to predictive analytics, more data is generally better. However, information sharing can prove to be a challenge, particularly in the federal government where data is typically siloed. Those silos must be dismantled, and it must become culturally acceptable to share information and use outside data when possible.
A key point is making sure the team has access to the right data to answer the questions and accomplish the objectives that have been identified. This often drives the need to incorporate external or cross-organizational data.
Information sharing extends beyond cultural issues, though -- technical challenges often abound as well. For example, agencies may have limited storage resources or incomplete data. It’s incumbent upon IT administrators to recognize and address these challenges so that they can make the best possible decisions for the agency and help the organization progress.
3. Choose a target application and get going
An ideal target application is one that has a positive and visible impact for the organization but is not mission critical. Projects must be able to demonstrate real results and deliver quick wins. A mid-level project typically works best and, as a bonus, can serve as a use case going forward.
Once the application is chosen, it’s best to simply get started. There are many cases of agencies suffering from "analysis paralysis," spending months or years creating models and drafting architecture. That time can be better spent collecting actionable data. In parallel, administrators can and should focus on needed data models, strategies and policies.
4. Select the right technology partner
Administrators should look for technology partners with proven success implementing predictive analytics programs who are also willing to work with a combination of open source and commercial tools to deliver high-value insights.
For example, administrators might look for a partner capable of implementing an edge-cloud approach, where computing power sits closer to, or at, the location where it is needed. This allows agencies to react quickly to changing dynamics at a given location, such as a battlefield or disaster relief station, delivering high-value analytical data and insights to the right resources at the right time.
When choosing a partner, agencies should be sure the organization has the relevant government certifications and can showcase how it helped other agencies kick off or augment their predictive analytics programs. Private-sector partners should also be able to show results tying their contributions to the final outcome, actively demonstrating return on investment.
5. Tie back to core agency metrics
Finally, administrators should avoid getting wrapped up in the technology itself. Agency leaders are less concerned with how many terabytes of data a system is handling than they are with seeing how the results will help current and future missions.
To that end, federal IT professionals should implement the same type of metrics that are used throughout their organizations. For example, leadership often likes to see metrics related to cost-cutting measures, infrastructure modernization initiatives or cybersecurity improvements. IT administrators can use predictive analytics to easily quantify results and show how they tie back to these mission-critical objectives. It’s not unusual for teams that invest in this technology -- and show proven ROI -- to receive additional funding.
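Quantifying that return can be as simple as comparing program cost against measurable savings. The figures below are hypothetical placeholders for a predictive-maintenance pilot, not results from any agency:

```python
# Hypothetical annual figures for a predictive-maintenance pilot.
annual_program_cost = 250_000        # platform licensing plus staff time
avoided_downtime_savings = 400_000   # outages prevented by early warnings

# Standard ROI formula: net gain divided by cost.
roi = (avoided_downtime_savings - annual_program_cost) / annual_program_cost
print(f"ROI: {roi:.0%}")
```

Framed this way, the same number leadership already tracks (cost avoided per dollar spent) carries the analytics story, rather than terabytes processed or models trained.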
The good news is that, thanks to the cloud, all agencies can now enjoy advanced predictive analytics. Few need a supercomputer; the cloud has made powerful, deep analytical insight viable for agencies of almost any size, at little cost. There’s no reason not to get on board. All it takes are a few important first steps to make the leap to advanced predictive analytics.
Steve Orrin is federal chief technologist for Intel.