The General Services Administration's USASpending.gov website is using the GCE Big Data and Analytics Cloud for storing and managing large volumes of data.
After tornadoes ripped through nearby Springfield, the technology director of Dedham decided it was time to move data backup and recovery to the cloud.
The federal government's big-data initiative fills key research and development gaps where industry fears to tread.
A new facility at DOE's Argonne National Laboratory will develop tools to extract knowledge from petabytes of data and help researchers use their time more efficiently.
Visualization and analytics will require cultural anthropologists and social scientists to help organizations understand how humans process information.
Highway safety groups' strategy, called Toward Zero Deaths, is ambitious, but data analytics combined with the "Four Es" makes it possible.
Shawn Henry, former head of the FBI's cyber crime team, says private-sector networks lack adequate defenses and need the same level of threat intelligence available to government networks.
Analytics tools are available to detect attack patterns, but agencies need to take a few steps before gaining the confidence to act on the intelligence they provide.
Hone, a tool being developed at the Pacific Northwest National Laboratory, links network traffic with an application, making it easier to find the source of an IT compromise.
The Data Transparency Coalition will lobby for passage of the DATA Act, which would make the Recovery.gov website a permanent portal for standardized reporting on all government spending.