Hadoop is a must-have big data tool, but it has some drawbacks, including the need for high-level user expertise and for optimal bandwidth conditions.
A Microsoft research paper describes an unconventional approach that could double data center efficiency.
The firm debuts rapidly deployable data center modules that provide more control over infrastructure at lower cost.
Livermore, Intel and Cray are deploying the uniquely designed Catalyst to explore new frontiers in HPC simulation and big data innovation.
The agency also consolidated all IT offices and staff under the leadership of GSA CIO Casey Coleman to eliminate duplication and streamline IT operations.
If Google is involved, is a floating data center that far-fetched?
OnCourse, a software as a service provider for the K-12 education market, contracts for analytics-based defensive services following a series of denial-of-service attacks.
CipherPoint data security software prevents privileged IT administrators and attackers from accessing sensitive information across on-premises and cloud-based collaboration systems.
The Windows Azure U.S. Government Cloud for state, local and federal agencies will be hosted in U.S. data centers and managed by U.S. citizens.
Surges caused by arc fault failures have melted equipment and will delay operations at the $1.5 billion center for a year.
Large power transformers, which regulate voltage along the electricity supply chain, are a soft spot in the power grid, experts say.
With NSF-backed funding, researchers will create an energy-efficient optical network, with silicon photonic switches, to handle massive amounts of data.
The goal of any consolidation project is savings. Here’s a checklist to help make sure things go right.
Two data center operators recently announced plans to open colocation facilities targeting government customers.