Metadata is valuable for extracting knowledge from smaller subsets of data, experts say, with applications in intelligence gathering as well as in energy, weather and public safety research.
The Graph 500 ranks HPC systems not by petaflops but by how well they handle data-intensive workloads. Livermore's Sequoia, an IBM Blue Gene/Q system, leads the pack.
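Graph 500's core kernel is a breadth-first search over a large synthetic graph, scored in traversed edges per second (TEPS) rather than floating-point operations. Below is a minimal sketch of that style of measurement on a toy adjacency list; the graph and sizes are illustrative stand-ins, not the benchmark's Kronecker generator.

```python
from collections import deque
import time

def bfs_teps(adj, source):
    """Breadth-first search over an adjacency list, returning the BFS
    parent map and a rough traversed-edges-per-second rate -- the style
    of metric the Graph 500 reports instead of FLOPS."""
    parent = {source: source}
    frontier = deque([source])
    edges_traversed = 0
    start = time.perf_counter()
    while frontier:
        v = frontier.popleft()
        for w in adj[v]:
            edges_traversed += 1
            if w not in parent:        # first visit: record tree edge
                parent[w] = v
                frontier.append(w)
    elapsed = time.perf_counter() - start
    return parent, edges_traversed / elapsed

# Toy graph; real Graph 500 runs traverse graphs with billions of edges.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
tree, teps = bfs_teps(adj, 0)
print(f"visited {len(tree)} vertices at ~{teps:,.0f} TEPS")
```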
Big data could save the government $500 billion with the right technology in place, but most IT execs say their agencies lack an adequate strategy, a MeriTalk survey finds.
The SALSA tool, developed at the Pacific Northwest National Laboratory, taps the lab's supercomputing power to pore over billions of posts in seconds.
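As a rough illustration of that kind of data-parallel scan (not SALSA's own implementation, which the article does not detail), here is a sketch that counts keyword mentions across a batch of posts split over worker processes:

```python
from multiprocessing import Pool

def count_mentions(args):
    """Count posts in one chunk that mention the keyword."""
    posts, keyword = args
    return sum(keyword in post.lower() for post in posts)

if __name__ == "__main__":
    # Stand-in corpus; a real run would stream billions of posts
    # across many supercomputer nodes rather than 4 local processes.
    posts = ["Power outage downtown", "Flooding near the river", "Clear skies today"] * 100_000
    chunks = [posts[i::4] for i in range(4)]   # round-robin split across workers
    with Pool(4) as pool:
        total = sum(pool.map(count_mentions, [(c, "flood") for c in chunks]))
    print(f"{total} posts mention 'flood'")
```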
The Security Content Automation Protocol (SCAP) sets standards to ensure security products work together, while Einstein is evolving into an automated tool that will not only detect but also block malicious code.
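The interoperability comes from SCAP expressing its content in standardized XML formats such as XCCDF, so any conforming product can consume the same checklist. Here is a minimal sketch that reads rules from an XCCDF 1.2 fragment; the benchmark and rule IDs below are invented for illustration.

```python
import xml.etree.ElementTree as ET

XCCDF_NS = "{http://checklists.nist.gov/xccdf/1.2}"  # XCCDF 1.2 namespace

# Made-up checklist fragment, not content from a real SCAP benchmark.
sample = """\
<Benchmark xmlns="http://checklists.nist.gov/xccdf/1.2"
           id="xccdf_org.example_benchmark_demo">
  <Rule id="xccdf_org.example_rule_password-min-length" severity="medium">
    <title>Minimum password length</title>
  </Rule>
</Benchmark>
"""

root = ET.fromstring(sample)
for rule in root.iter(f"{XCCDF_NS}Rule"):
    title = rule.find(f"{XCCDF_NS}title").text
    print(rule.get("id"), rule.get("severity"), "-", title)
```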
A growing number of products can help automate IT security; Nevada's DOT found they can help in other areas, too.
The agency gets ready to fire up the world's third-largest data center, with zettabytes of capacity, as it begins work on another new high-performance center.
Information Builders' iWay 7 platform gives analysts direct access to information from any source through more than 300 pre-packaged integration components.
Big data, analytics, mobile computing and social media will blend together, with services doled out by cloud brokers.
Interoperability standards will create a world of interconnected clouds fraught with opportunities and security risks, experts say.
A peering arrangement gives schools fast access to Microsoft cloud services, offering improved collaboration, quicker exchange of large data sets and faster app development.
Big data's roots run deep. Here's a look at key events over the past 30 years that have shaped the way data is collected, managed and analyzed, and that help explain why big data is such a big deal today.