Understanding the impact of four revolutionary technologies will help IT managers and developers consider new ways to attack their own problems and build better software faster.
The city of Raleigh, N.C., is reaching out to partner municipalities to maximize its big data potential.
Buffalo uses customer service software to create density maps of citizen 311 complaints and orchestrate 'clean sweeps' of two- and three-block areas.
IBM announced three cloud-based Smarter Cities management centers, which will help cities use their own data to gain insight into citizen services and improve decision making.
In July, Chicago will mount sensors on light poles, the first stage of a big data collection and analysis system that the city plans to open to other jurisdictions.
A use case shows how big data can be applied, what business need that application meets and what must happen to make it a reality.
The all-day event showcased the wide array of DARPA projects addressing national security challenges posed by the information revolution and the availability of sophisticated information technologies.
A new report from the National Association of State Chief Information Officers finds that open data initiatives are advancing, enabling state and local governments to create innovative ways of delivering data to individual, business and government consumers.
Trusted Information Exchange Service for Microsoft CityNext gathers data from hundreds of sources, filters the information by relevance and delivers it via dashboards, email, text or phone call alerts.
A recent RAND report said that a distributed cloud system could help the Navy manage its growing volume of intelligence, surveillance and reconnaissance data.
The Postal Service inspector general's office built its own data analytics tool -- the Risk Assessment Data Repository -- to help investigators identify high-value targets for audit and investigation, reducing the time required to close a case and increasing the amount of money recovered.
The Catalyst supercomputer at Lawrence Livermore National Laboratory is being made available to industry and academic collaborators to test big data technologies, architectures and applications.