Splunk Analytics for Hadoop lets users explore, analyze and visualize data natively within the open-source distributed-processing framework.
A Naval Research Lab team develops a more efficient set of algorithms to increase the resolution of its maps.
The Graph 500 ranks HPC systems not by petaflops, but on how well they handle data-intensive workloads. Livermore's Sequoia, an IBM Blue Gene/Q system, leads the pack.
Metadata is valuable for extracting knowledge from smaller subsets of data for intelligence-gathering as well as for energy, weather and public safety research, experts say.
Big data can save government $500 billion with the right technology in place, but most IT execs say their agencies lack an adequate strategy, a MeriTalk survey finds.
The SALSA tool developed at the Pacific Northwest National Laboratory taps the lab's supercomputing power to pore over billions of posts in seconds.
SCAP sets standards to ensure products work together, while Einstein is evolving into an automated tool that will not only detect but also block malicious code.
A growing number of products can help automate IT security; Nevada's DOT found they can help in other areas, too.
The agency gets ready to fire up the world's third-largest data center, with zettabytes of capacity, as it begins work on another new high-performance center.
Big data, analytics, mobile computing and social media will blend together, with services doled out by cloud brokers.
Information Builders' iWay 7 platform gives analysts direct access to information from any source, with more than 300 pre-packaged integration components.
Interoperability standards will create a world of interconnected clouds fraught with opportunities and security risks, experts say.