U.S. Citizenship and Immigration Services has enhanced the Homeland Security Department's E-Verify system by using analytics to flag and 'lock' Social Security numbers that have been used fraudulently.
The open-source cluster computing framework Spark speeds programming and can run workloads up to 100 times faster than Hadoop MapReduce.
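Much of that speedup comes from Spark keeping intermediate data cached in memory across iterations, whereas MapReduce rereads input from disk between stages. A toy pure-Python sketch of that difference (illustrative only; these helper functions are hypothetical and do not use the real Spark or Hadoop APIs):

```python
import os
import tempfile

# Toy illustration: two iterative jobs over the same dataset.
# The "MapReduce-style" loop rereads and reparses its input from disk on
# every pass; the "Spark-style" loop parses once and reuses the cached
# in-memory data, which is where Spark's iterative speedup comes from.

def write_dataset(path, n=1000):
    """Write one integer per line, standing in for an HDFS input file."""
    with open(path, "w") as f:
        for i in range(n):
            f.write(f"{i}\n")

def mapreduce_style(path, iterations=5):
    """Reread the input from disk for each iteration."""
    totals = []
    for _ in range(iterations):
        with open(path) as f:          # disk read + parse, every pass
            data = [int(line) for line in f]
        totals.append(sum(x * 2 for x in data))
    return totals

def spark_style(path, iterations=5):
    """Load and parse once, then iterate over the cached data."""
    with open(path) as f:              # one disk read, cached in memory
        cached = [int(line) for line in f]
    return [sum(x * 2 for x in cached) for _ in range(iterations)]

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "data.txt")
    write_dataset(path)
    # Both styles compute the same result; only the I/O pattern differs.
    assert mapreduce_style(path) == spark_style(path)
```

The two functions return identical results; the performance gap in real clusters comes from avoiding the per-stage disk round trips the first loop models.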
Hadoop is a must-have big data tool, but it has drawbacks, including the need for high-level user expertise and for optimal bandwidth conditions.
Public-sector priorities: a strong IT foundation and tools for prepping data.
The move to open data policies is spawning new approaches to data virtualization for exploiting and managing an expanding number of government information sources.
The announcement by IBM that it would add Watson-like technologies as a cloud service signals the beginning of a race to build the brainiest cloud.
Fusion-io and Sqrrl improve big data workload efficiency with flash memory, providing significantly more performance at lower cost than high-density DRAM systems, the companies say.
Big data platforms, both proprietary and open source, are emerging as comprehensive solutions for handling the magnitude, complexity and variety of data.
Livermore, Intel and Cray are deploying the uniquely designed Catalyst to explore new frontiers in HPC simulation and big data innovation.
Architects must show how their system designs operate in the field as agencies tackle big data, cloud and mobility issues.
Text analytics is more than just sentiment analysis. It is being used to enable churn analysis, fraud detection, risk analysis, warranty analysis, medical research and other nontraditional use cases.
At the 2013 Executive Leadership Conference, industry and government leaders called for practices that could cut costs and show returns.
Risk assessment programs weigh factors beyond the traditional considerations of parole boards and seem to be having an effect.
The Enterprise Management Decision Support System draws on 3,500 database systems to create a picture of military readiness.