
Agencies hot for big data, but plans, resources are lacking

Big data analytics has the potential to save nearly $500 billion — or 14 percent of agency budgets — across the federal government, and 69 percent of federal IT executives in a new survey say deriving more value out of the troves of data will increase efficiency, foster smarter decisions and deepen insights.

However, just 31 percent say their agency has an adequate big data strategy, according to the survey of 150 federal IT executives by MeriTalk.

The survey findings appear in the report “Smarter Uncle Sam: The Big Data Forecast,” which was underwritten by EMC Corp.

Big data is defined as a volume, velocity and variety of data that exceeds an organization’s storage or computing capacity for accurate and timely decision-making. Public-sector agencies have worked for years on complex, analytic projects in many domains before the term ‘big data’ came along. What has changed, according to industry experts, is that the cost of computing has come down, unlocking capabilities for agencies to analyze and find hidden value in data.

As a result, agencies are putting in place the building blocks to tap into the hidden value of big data. For instance, nearly one-fourth of the federal IT executives have launched at least one big data initiative, investing in IT systems and solutions to improve data capture, processing and storage as well as identifying challenges that big data can solve, according to the report. Federal agencies are spending money on technology to:

  • Increase server storage capacity to house and analyze big data
  • Determine bandwidth needs for big data storage and analytics
  • Deploy advanced data mining practices

Sequestration budget cuts pose a significant risk to launching new big data programs, federal IT executives said. Forty-one percent are experiencing budget cuts of more than 10 percent as a result of sequestration. The executives identified several sequestration casualties, including: training and workforce development (51 percent), hardware upgrades (48 percent), software upgrades (41 percent) and new applications development (40 percent).

To prepare for deeper analysis of data, agency managers should significantly increase data management efforts, ideally tagging 46 percent of agency data and analyzing 45 percent, the report states. In five years, successfully leveraging big data will be critical to fulfilling agency mission objectives, 70 percent of the IT executives said. Big data will help fulfill agency missions by improving processes and efficiency (51 percent), enhancing security (44 percent) and helping to predict trends (31 percent).

The findings align with other surveys focusing on the use of big data analytics in government and business. A TechAmerica Foundation study of nearly 200 public sector IT professionals released in February indicated that both federal and state IT officials think big data analytics can have real and immediate impacts on how governments operate, from helping to predict crime to cutting waste and fraud.   

The survey, commissioned by SAP AG and conducted by pollsters Penn Schoen and Berland, cites the potential of big data analytics to improve lives and save money. But federal and state governments are already deriving value from specific projects.

For example, big data and text analytics are being applied to agency projects that deal with large amounts of unstructured data, such as NASA’s analysis of airline safety reports and a Homeland Security Department-funded bio-preparedness collective. Newly available data on lightning activity within clouds gives the National Weather Service, NASA and the military better warnings about severe weather.

Predictive analysis is growing as a crime-prevention tool, as city police departments such as those in Baltimore and Philadelphia use analytic tools to parse large volumes of data to forecast patterns and prevent crimes. Analytics is also fueling the Air Force’s efforts to improve patient care and support research into preventive medicine and disease management, making it easier for clinicians to comb through data to find meaningful insights.

Still, many organizations cannot gather business insights from big data fast enough to make informed decisions quickly, according to a recent survey of 200 business professionals in large organizations conducted by IDG Research Services and Kapow Software. More than 85 percent of business and IT leaders agreed that big data offers substantial value by enabling more informed business decisions and fostering a data-driven organization. However, more than half of the respondents (52 percent) rate big data project success so far as lukewarm (or somewhat successful), and only 23 percent perceive big data projects as an outright success. The inability to process structured and unstructured data quickly and effectively is among the biggest challenges, with 60 percent of respondents noting that big data projects typically take at least 18 months to complete.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.
