Big data's target: Users
- By Rutrell Yasin
- Mar 29, 2012
Government is amassing data at an accelerating rate, and it even has tools available to process and analyze it. But to get real value from all that information, the government must put it into the hands of users, the General Services Administration’s Dave McClure told a Washington audience.
“The challenge is, what do we do with all the data that we are using? How do we sort it, analyze it and get value within the business owner’s context?” McClure, associate administrator with GSA’s Office of Citizen Services and Innovative Technologies, asked March 28 in a keynote address at the AFCEA Bethesda Chapter’s Big Data Technology Symposium held in Washington, D.C.
The symposium also included panel discussions with agency managers that focused on detecting and preventing improper payments, the new era in health information exchanges, and building actionable information for law enforcement and homeland security.
To date, there has been a focus on dashboards, visualization and ad hoc querying oriented around science research, intelligence gathering, and natural disaster and emergency data, which is event-driven, McClure said.
Other areas where the filtering and analysis of large volumes of data, known as big data, are occurring include health, modeling, forecasting and growth prediction, he added.
In an earlier keynote, Dan Vesset, program vice president of business analytics with IDC, gave his company’s definition of big data.
Big data is “a new generation of technology and architecture designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture and/or analysis” of the data, Vesset said.
Big data comprises four layers, Vesset said: infrastructure, the servers on which applications run; data organization and management, the software that processes and prepares all types of data for analysis; analytics and discovery tools; and decision support software.
Tools for discovery and analysis of big data include Apache Hadoop, software that allows for the distributed processing of large datasets across clusters of computers, as well as graph databases, predictive analysis tools, search and discovery software, object-oriented databases and relational databases.
“There are a lot of choices; the key is to understand the sweet spot for each of them,” Vesset said, noting that Hadoop and relational database management systems must co-exist.
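Vesset's point about knowing each tool's sweet spot turns on the programming model Hadoop popularized: MapReduce, in which records are mapped to key/value pairs, shuffled by key, then reduced. A minimal single-machine sketch of that model in plain Python (function names are illustrative, not part of any Hadoop API; Hadoop itself runs these phases in parallel across a cluster):

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (word, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all values by key, as Hadoop does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a final count.
    return {key: sum(values) for key, values in groups.items()}

logs = ["error warn error", "warn info"]
counts = reduce_phase(shuffle(map_phase(logs)))
print(counts)  # {'error': 2, 'warn': 2, 'info': 1}
```

The same three-phase structure scales from this toy example to cluster-sized datasets, which is why Hadoop suits bulk batch analysis while relational systems remain better for transactional, query-heavy workloads.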
The commercial sector jumped into the big data space faster than government with a focus on driving revenue and reducing costs, McClure said. Private-sector companies were spurred by the need to make quick decisions to move to market faster, he added.
On the government side, the dialogue should be framed around improving performance and efficiency but “doing it with less effort and pain,” McClure said.
Intersecting trends, such as the move toward self-service and the consumerization of IT, will spur efforts to get more data into the hands of business users in government and the private sector, both McClure and Vesset said.
The goal is no longer just to produce reports but to get “information into the hands of the end user and [let] them manipulate it, do the visualization and analytics in simple, fast, easy terms so they can get value,” McClure said.
Other trends that will spur adoption of big data are the emergence of analytics and search technologies, and the push for data standards, McClure noted.
McClure added that Data.gov remains a powerful experiment in the democratization of data. A treasure trove of data has been opened up across government so that big questions can be tackled in a joint, multi-agency fashion, he said.
Rutrell Yasin is a freelance technology writer for GCN.