Big data's roots run deep. Here's a look at key events over the past 30 years that have shaped the way data is collected, managed and analyzed, and that help explain why big data is such a big deal today.
Florida's Department of Children and Families is using LexisNexis Risk Solutions, which creates a profile based on customer-supplied information, to authenticate people applying for public assistance.
The Commerce Department CIO offers insights on how to think about big data in ways that cut its challenges down to size.
The annual conference and expo kicks off Tuesday, tackling the key challenges facing the public-sector IT community.
New analytic technologies will help agencies and investigators look deeper into behavioral patterns to combat some of the more sophisticated fraud schemes on the horizon.
Google Earth Engine collects and converts millions of Landsat images from USGS into a picture of how the Earth's surface has changed.
Scientists at the Energy Department's Los Alamos National Lab are working toward building an exascale computer that by 2020 could be powerful enough to model the human brain, cell by cell.
MQTT, proposed as an OASIS standard, can help agencies manage all the data generated by sensors, mobile devices and other machine-to-machine networks.
A MeriTalk survey finds that many agencies lack the storage capacity, computing power and personnel to handle the data they have.
The XSEDE research network operators say bandwidth and networking tools are key to moving large files between high-performance systems.
New add-ons for the spreadsheet program can combine Census, weather and demographic data to aid evacuations in an emergency.
Fiserv's payment accuracy and fraud solution helps agencies reduce cash losses from improper disbursements.