Using SAS Analytics, city officials from Cary, N.C., monitor budgets across departments, improve police efficiency and track customer service goals.
The open-source cluster-computing framework Spark speeds programming and can run up to 100 times faster than Hadoop MapReduce.
Hadoop is a must-have big data tool, but it has some drawbacks, including the need for expert users and optimal bandwidth conditions.
Public-sector priorities: a strong IT foundation and tools for prepping data.
The move to open data policies is spawning new approaches to data virtualization to exploit and manage an expanding number of government information sources.
Fusion-io and Sqrrl improve big data workload efficiency with flash memory, providing significantly more performance at less cost than high-density DRAM systems, the companies say.
Big data platforms, both proprietary and open source, are emerging as comprehensive solutions to handle the magnitude, complexity and variety of data.
Livermore, Intel and Cray are deploying the uniquely designed Catalyst to explore new frontiers in HPC simulation and big data innovation.
Architects must show how their system designs operate in the field as agencies tackle big data, cloud and mobility issues.
Text analytics is more than just sentiment analysis. It is being used to enable churn analysis, fraud detection, risk analysis, warranty analysis, medical research and other nontraditional use cases.
At the 2013 Executive Leadership Conference, industry and government leaders called for practices that could cut costs and show returns.
Risk assessment programs weigh factors beyond the traditional considerations of parole boards and seem to be having an effect.