

How DataOps can unleash the power of AI

Under the Industries of the Future Act, Congress proposes to spend $10 billion on artificial intelligence, quantum computing and 5G over the next 10 years. Further, the recently released National Security Commission on Artificial Intelligence report highlights how the AI revolution will impact the U.S. economy, national security and welfare.

Recognizing the significance of these groundbreaking technologies is an important first step, but now comes the difficult job of unlocking value commensurate with the investment being made. This will require new thinking about the management and delivery of use cases that are in the best public interest. Now that emerging technologies are leveraging computing power, network capacity and AI methodology, there are three key measures that will maximize citizen value.

Use case governance

To be candid, $10 billion can only go so far if it is spread too broadly across the federal landscape. Rather than giving every agency a piece of the pie, there must be a clear process for creating, prioritizing and measuring use cases that will deliver the most impact. Investing in technology without a clear picture of the outcome is a recipe for inefficiency.

To avoid this scenario, we need clear objectives with measurable outcomes, focused on a few high-priority agencies and projects. These use cases should be evaluated on their potential public benefit, the availability of supporting data, the capacity of government and citizens to adopt them and their ethical implications. Many private organizations have run value-engineering or vision workshops to identify and align leadership around potential AI use cases. Something similar will be required of government, likely coordinated through the Chief Data Officer in conjunction with department CDOs under the auspices of the Federal Data Strategy. This framework would not only create clarity on the expected outcome, but also enable metrics for progress and for the solution's impact.

Managing the people’s data

Government is obligated to ensure the people’s data is used efficiently, effectively and ethically in applications, but the public has both expectations and skepticism of government data use. More often than not, federal institutions are held to a higher standard than the private sector with respect to the security and appropriate use of data.

That’s especially true for emerging technologies like artificial intelligence. Any AI use case will only be as useful as the data -- and the data pipelines -- that power the solution. These pipelines must include glossaries and dictionaries that help to explain the data and its appropriate treatment and use. They must also feature real-time data streaming of specific, reusable assets, replacing both the rigid data warehouse model as well as the overwhelming and complex data lake model. Creating self-service, easy access for the right data at the right time for identified use cases will accelerate delivery and create value more quickly. For this, too, the Federal Data Strategy will serve as the North Star.
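What a cataloged, self-service data asset might look like can be sketched in a few lines. This is a minimal illustration, not any specific federal system or product; all names here (DataAsset, CATALOG, the sample asset and its steward) are hypothetical.

```python
from dataclasses import dataclass, field

# A minimal, illustrative sketch of a reusable data asset that carries
# its own glossary entry and usage guidance, so consumers find the
# right data with its context. All names are hypothetical.

@dataclass
class DataAsset:
    name: str
    description: str                  # plain-language glossary entry
    steward: str                      # who governs appropriate use
    allowed_uses: list = field(default_factory=list)

CATALOG = {}  # self-service registry keyed by asset name

def register(asset: DataAsset) -> None:
    """Publish an asset so identified use cases can discover it."""
    CATALOG[asset.name] = asset

def lookup(name: str) -> DataAsset:
    """Self-service access: the right data, with its documentation."""
    return CATALOG[name]

register(DataAsset(
    name="permit_applications",
    description="Daily stream of permit applications by agency",
    steward="Department CDO",
    allowed_uses=["workload forecasting"],
))

asset = lookup("permit_applications")
```

The point of the sketch is that the glossary entry and allowed uses travel with the data itself, rather than living in a separate warehouse schema or an undocumented data lake.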

Enter DataOps

Data operations, or DataOps, is an agile, process-oriented approach to improving the quality and reducing the cycle time of realizing value from data analytics. Like its better-known sister concept DevOps, it removes friction and accelerates how organizations deliver and maximize the value of key IT assets -- in this case, data. DataOps is really a strategic mindset that encompasses people, processes and technology to streamline decision-making.

With an increasing focus on integration and automation, DataOps breaks down the silos in IT operations, allowing governments to move data at the speed of change while providing more information on the different source data collected. It creates clear models for the development, promotion and management of data, algorithms and applications at scale. As mission-driven use cases evolve, DataOps powers a strategic shift that treats data as an integral piece of overall mission strategy, and it will be absolutely essential to agencies driving real outcomes with AI.

It is often said that data science is a team sport. This is because data, analytics and emerging technologies like AI must be aligned to deliver value at speed. AI is only as good as the information on which it is built, and applications will only deliver value so long as there is a continuous review of their algorithms. Initiatives such as the Joint Artificial Intelligence Center need a combination of clarity of mission, accurate and actionable data and a defined DataOps methodology to reach their full potential and use AI to solve large and complex problem sets that span multiple combat systems.

DataOps creates data pipeline efficiency that allows governments to automate processes and ensure changes made at the beginning of the pipeline are accounted for throughout. With this foundation, governments can create systems that enable analytic opportunities for efficient, effective and ethical AI that delivers mission value.
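The idea that changes at the beginning of the pipeline are accounted for throughout can be sketched as stages that each declare the schema they expect, so upstream drift is caught rather than silently propagated. This is an illustrative toy, not a real DataOps platform; the function and field names are hypothetical.

```python
# Illustrative sketch of a DataOps-style pipeline: each stage validates
# the schema it expects, so a change made upstream fails loudly instead
# of corrupting downstream results. All names are hypothetical.

def validate(record: dict, required: set) -> dict:
    """Fail fast if an upstream change removed an expected field."""
    missing = required - record.keys()
    if missing:
        raise ValueError(f"schema drift: missing {sorted(missing)}")
    return record

def ingest(raw: dict) -> dict:
    return validate(raw, {"id", "timestamp", "value"})

def transform(rec: dict) -> dict:
    out = dict(rec)
    out["value"] = float(out["value"])  # normalize types for analytics
    return out

def pipeline(raw: dict) -> dict:
    return transform(ingest(raw))

result = pipeline({"id": 1, "timestamp": "2021-04-01", "value": "3.5"})
```

In a real pipeline the same principle extends to automated tests run on every change, which is what lets the process move at the speed of the mission without sacrificing trust in the data.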

About the Author

Andrew Churchill is vice president of federal at Qlik.

