3 trends for government data analytics

INDUSTRY INSIGHT

In 2015, data discovery moved out of the IT shop and onto users’ desktops. Tools such as Tableau, QlikView, Domo and Datameer augmented existing enterprise IT systems and allowed individuals to apply sophisticated visualization capabilities to vast amounts of existing data without writing code. The days of submitting requirements, designing and developing solutions and testing and deploying systems over many months (or years) for all things data-related have faded.

With these new discovery tools, data access and analysis can now happen at the speed of business. This change is driving three important trends government agencies will see in 2016.

Data simplification: Improving data access and usage is at the top of the list for most agencies. They want the average end user to have an easier time retrieving and analyzing data. If users can bypass database administrators and SQL gurus, who build and maintain enterprise IT systems on longer development cycles, they will do it. This easy access to data saves time, reduces cost and accelerates decision making.

Discovery tools are helping users quickly transform raw data into meaningful information that lets them slice and dice information in many different ways. While this capability exposes the power of agency data, it could also lead to misinterpretation of data if not properly used.
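The "slice and dice" workflow these tools automate boils down to grouping raw records and summarizing each slice. A minimal sketch in plain Python, using entirely hypothetical service-request data, shows the idea:

```python
from collections import defaultdict

# Hypothetical raw records: one row per service request.
requests = [
    {"office": "East", "status": "closed", "days_open": 3},
    {"office": "East", "status": "open",   "days_open": 12},
    {"office": "West", "status": "closed", "days_open": 5},
    {"office": "West", "status": "closed", "days_open": 2},
]

# "Slice" the raw rows by office.
by_office = defaultdict(list)
for r in requests:
    by_office[r["office"]].append(r["days_open"])

# Summarize each slice: request count and average days open.
summary = {
    office: {"count": len(days), "avg_days": sum(days) / len(days)}
    for office, days in by_office.items()
}
print(summary)
```

A discovery tool performs this same group-and-aggregate step interactively, which is also where misinterpretation creeps in: choosing the wrong grouping key or averaging incomparable units produces a chart that looks authoritative but misleads.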

Data transparency: Through open-data initiatives and websites, such as Data.gov, anyone with Internet access and a data discovery tool can download the government’s non-sensitive data, blend it into reports or dashboards and otherwise use it as they wish.

I recently helped a federal agency answer questions from a University of North Carolina graduate student who downloaded the agency’s public data. We helped the student expand his analysis by using a data discovery tool to provide a variation of the same dataset to support his project, illustrating how discovery tools can capture data from external sources (e.g., data.gov, social media) and then combine it with internal sources (e.g., spreadsheets, system of record databases, data warehouses). This access lets users answer their own questions without requiring involvement from their enterprise IT shop.
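Blending an external dataset with an internal one, as in the example above, is essentially a join on a shared key. The sketch below uses only the Python standard library and invented sample data (the program IDs, column names, and figures are illustrative assumptions, not the agency's actual data):

```python
import csv
import io

# Hypothetical extract of public open data (e.g., downloaded from data.gov):
# obligations by program, keyed on a shared program_id.
public_csv = """program_id,program_name,fy2015_obligations
P-101,Rural Broadband,1200000
P-102,Water Quality Grants,870000
"""

# Hypothetical internal spreadsheet export: staffing per program.
internal_csv = """program_id,full_time_staff
P-101,14
P-102,9
"""

def load(text):
    """Parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

# Index internal rows by the shared key, then blend external and internal.
internal_by_id = {row["program_id"]: row for row in load(internal_csv)}
blended = [
    {**pub, **internal_by_id.get(pub["program_id"], {})}
    for pub in load(public_csv)
]

for row in blended:
    print(row["program_name"], row["fy2015_obligations"], row["full_time_staff"])
```

Discovery tools wrap this join in a drag-and-drop interface, which is what lets an end user combine data.gov downloads with internal spreadsheets without asking the IT shop to build anything.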

Data security: Security will become a larger issue because there are more opportunities for compromise, now that more individuals can access more data. Before, system administrators implemented rules that dictated which people could access particular data through enterprise IT systems. Now, data discovery tools can directly connect to databases and bypass these rules if security protocols aren’t carefully constructed, putting government and its information at risk. Rather than addressing the most common vulnerabilities through a secure central access point, government must now look to protect data at various levels and entry points to ensure it is only accessible to those users with the proper authorization.
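One way to protect data "at various levels" is to enforce authorization in the data layer itself, so a discovery tool that connects directly still sees only permitted rows. The following is a minimal sketch of that idea; the roles, clearance levels, and records are all illustrative assumptions:

```python
# Map each role to a numeric clearance level (illustrative values).
ROLE_CLEARANCE = {"analyst": 1, "manager": 2, "admin": 3}

# Each record carries its own sensitivity label.
RECORDS = [
    {"id": 1, "sensitivity": 1, "value": "public stats"},
    {"id": 2, "sensitivity": 2, "value": "internal metrics"},
    {"id": 3, "sensitivity": 3, "value": "restricted audit data"},
]

def visible_rows(role, records):
    """Return only the rows the role's clearance level permits."""
    clearance = ROLE_CLEARANCE.get(role, 0)  # unknown roles see nothing
    return [r for r in records if r["sensitivity"] <= clearance]

print([r["id"] for r in visible_rows("analyst", RECORDS)])  # -> [1]
print([r["id"] for r in visible_rows("admin", RECORDS)])    # -> [1, 2, 3]
```

Because the filter runs at the data layer rather than in the application front end, a self-service tool connecting straight to the database cannot bypass it the way it can bypass application-level screens.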

The potential is at our fingertips

Government agencies are turning to chief data officers to play a larger role in data governance. With the popularity of self-service data discovery tools, CDOs must provide guidance so their agencies can achieve data simplification, transparency and security. They can focus on “reducing the complexity of getting and using data and … being able to simplify the retrieval of the data that you need.”

Otherwise, government won’t be able to truly realize the full potential of the tools and the enormous amounts of data it (and the public) has at its fingertips.

About the Author

Mark DeRosa is director of business intelligence and analytics at Definitive Logic.

Reader Comments

Tue, Jan 5, 2016 Mark DeRosa

I totally agree with you, Louis. Poor data quality is an obstacle that can (and does) affect widespread use and adoption, as well as future investments in analytics. Since data usage and adoption are directly related to the quality of data, it is wise to invest more time and resources into data quality (at the points of origin) to yield much better returns on investment. Investing a little more time up front can pay big dividends later in terms of time and cost savings, not to mention the benefits of much more accurate and defensible data-driven decisions. Thanks for taking the time to read and comment on this article!

Wed, Dec 23, 2015 Louis Tominack

Excellent article demonstrating the leap in data analytics as a management technique that can be used to make more informed and timely decisions. A significant impediment to robust adoption still exists in the form of poor data quality, which leads to inaccurate results and poor decisions.
