

Solving the time-to-value data challenge

The global data explosion is well documented. In November 2018, IDC predicted that by 2025, worldwide data creation would grow to 163 zettabytes -- a zettabyte is equal to a trillion gigabytes.

Government datasets are multiplying at a similar pace, and that puts enormous pressure on government IT systems and the personnel who manage them, especially as agencies strive to meet the data-driven vision laid out in the Federal Data Strategy. It's difficult to even collect and store that much information, never mind analyze it. But analysis is why agencies collect data in the first place: to inform critical decisions that deliver mission value and improve citizen services.

Even if agencies were able to wrap their arms around every piece of relevant data, the larger issue is how fast technologists can make that data actionable. Fast and reliable analysis is a requirement when the mission involves warfighter safety, insider threats, cyber attacks or veteran health. Turning data into decisions requires a fast time to value, and this is one of the biggest data challenges government agencies currently face. 

See the forest for the trees

With the explosion of connected devices and endpoints, the amount of data collected presents significant challenges. Employees, contractors and citizens are using mobile devices to communicate with government networks. Internet-of-things devices, satellites and sensors produce constant streams of information, making it difficult to quickly search for data attributes such as time, location and identity. It can take months -- or even years -- for agencies to process their data and make enough sense of it to impact mission outcomes.

To compound these challenges, data is often duplicated across government agencies, creating redundancies that tie up more physical, intellectual and financial resources than necessary in today’s hyper-connected world.

Agencies can make strategic decisions only if they can see the forest for the trees when it comes to all of the data that they collect. To maximize impact, they need both a high-level view of their big data and the granular insights from analysis. Ideally, data visualization dashboards would provide insights for high-level decisions on what data to keep, what to drill down on and what should be processed using machine learning tools. Of course, all these decisions must be made in a timely manner so that the insights are relevant to the mission at hand.

Search-powered speed 

Overcoming challenges of data-collection size and faster time to insight requires near real-time searches of all types of data no matter where it’s stored: on current and legacy systems, locally or in the cloud. Federated search tools previously required data to be moved to a central location for analysis, but modern search solutions allow the data to remain close to where it’s created -- both locally accessible and available to the entire enterprise.
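The federated pattern described above can be sketched in a few lines of Python. This is a toy illustration, not any particular product's API: the site names and records are invented, and each "site" is just an in-memory list standing in for a local or cloud data store. The point is the shape of the approach -- the query fans out to each site, filtering happens where the data lives, and only the matches travel back to be merged.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical local data stores -- in practice these would be separate
# clusters, agencies or cloud regions; here they are in-memory lists.
SITES = {
    "on_prem": [{"event": "login", "user": "alice"},
                {"event": "logout", "user": "bob"}],
    "cloud":   [{"event": "login", "user": "carol"}],
    "legacy":  [{"event": "login", "user": "alice"}],
}

def query_site(name, records, event):
    """Filter records at the site itself; only matches leave the site."""
    return [dict(r, site=name) for r in records if r["event"] == event]

def federated_search(event):
    """Fan the query out to every site in parallel and merge the answers."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_site, name, recs, event)
                   for name, recs in SITES.items()]
        results = []
        for future in futures:
            results.extend(future.result())
    return results

hits = federated_search("login")
```

Because each site filters its own records before anything moves, the central merge step handles only the answer set rather than the full corpus -- the property that lets data "remain close to where it's created" while still being searchable enterprise-wide.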

For agencies to effectively leverage all the data at their disposal, it’s important they know what data is actually available. Modern search solutions provide a simple interface for exploring large swaths of data and helping users understand what’s accessible and how it can be used.

Think of it this way: Search engines like Google let users ask questions with very little knowledge of the data and get relevant answers within seconds. That’s the type of search-powered speed and analytics government needs.
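What makes that speed possible is indexing: rather than scanning every document per query, a search engine builds an inverted index up front -- a map from each term to the documents containing it -- so a query becomes a handful of dictionary lookups. The sketch below is a minimal, self-contained illustration of the idea; the sample documents are invented.

```python
from collections import defaultdict

# Toy corpus standing in for an agency's documents.
DOCS = {
    1: "insider threat detected on network",
    2: "veteran health records updated",
    3: "network outage impacts veteran services",
}

# Build the inverted index once: term -> set of document ids.
index = defaultdict(set)
for doc_id, text in DOCS.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return ids of documents containing every term in the query."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]  # intersect postings, never scan documents
    return results
```

A query like `search("veteran network")` intersects two small postings sets instead of reading three documents; at scale, that difference is what turns a months-long analysis problem into a sub-second lookup.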

Global real-time search platforms can address the entirety of an agency’s data today. These platforms deliver search at the scale agencies require to address the exponentially increasing volumes of information coming in and then package answers in a way that enables insight for mission leaders.

Modern interfaces make queries accessible for non-technical professionals, democratizing data analysis through easy-to-use dashboards and visualization tools. Empowering data-driven decision-making as close to the mission as possible can give agencies the agility and speed required to stay one step ahead of threats, optimize resources and transform service delivery.

Helping data reach its potential

For government to get the most out of its data, it’s clear that speed -- having as short a time to value as possible -- is of critical importance. Modern search technologies can provide a wider view of all the data an agency collects, and the near-real-time search and analysis capabilities of some of these technologies deliver fast, actionable insights.

The best search solutions don't need to scan an entire corpus of information to deliver an answer. Instead, they index data as it arrives, so queries can jump straight to where the most relevant data resides inside massive datasets and quickly drill down into those areas, providing the scale and speed for timely insights. This is the solution to the time-to-value challenge.

Searching and analyzing global data supports everything from IT modernization strategies to geospatial data’s impact on military readiness. Quick-turn analysis can help agencies spot disease vectors, fight human trafficking and safely deliver humanitarian aid to disaster-stricken regions. The faster agencies can find answers within their data, the better their mission outcomes will be.

About the Author

George Young is vice president, US public sector, at Elastic.

