Data Center Consolidation


Microsoft announces official SQL Server 2008 R2 date

Microsoft said SQL Server 2008 R2, the next version of its database, will be generally available in May.

Intel unveils experimental 'cloud computer' chip

Intel pushed the outer limits of computing by unveiling an experimental, 48-core processor it described as a single-chip cloud computer.

IBM partnerships extend portable data center capabilities

IBM has extended partnerships with providers of data center infrastructure technology to expand the deployment of the company's Portable Modular Data Center worldwide.

The path to cloud computing is beginning to take shape

Cloud computing still has a lot of uncertainties -- among them a lack of maturity in many of its potential services -- but the path toward this next era of enterprise computing is beginning to take shape.

5 steps to secure your data center

With the advent of cloud computing, rich Internet applications, service-oriented architectures and virtualization, data center operations are becoming more dynamic and their boundaries more fluid. This shifting form of computing adds layers of complexity that have broad implications for how IT managers secure the components that make up a data center.

Cloud computing: Winners and losers

Technology companies that expect to benefit from cloud computing must creatively adapt their licensing, pricing and revenue models.

Virtualization proves cost effective for some agencies

Capacity planning is a vital step data center managers must take to achieve power savings and other benefits from virtualization technology, according to IT managers representing two federal organizations.

Agencies face tough questions on how to deploy cloud computing

Security, data privacy, the acquisition process, standards and service-level agreements are among the chief issues federal agencies grapple with.

Interior developing cloud infrastructure services

Cloud computing will fundamentally change the shared services model, National Business Center director predicts.

NSF commissions supercomputer to visualize ever-larger data sets

The 2,048-core system, nicknamed Longhorn, will be capable of 20.7 trillion floating-point operations per second and will help researchers keep pace with the explosive rate of data production, TACC officials said.
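As a quick sanity check on those figures, 20.7 teraflops spread across 2,048 cores works out to roughly 10 gigaflops per core. A minimal sketch of that arithmetic (whether the quoted number is peak or sustained throughput is an assumption; the article does not say):

```python
# Back-of-the-envelope per-core throughput for the Longhorn system,
# using only the figures quoted above.
total_flops = 20.7e12   # 20.7 trillion floating-point operations per second
cores = 2048            # total core count

per_core_gflops = total_flops / cores / 1e9
print(f"~{per_core_gflops:.1f} GFLOPS per core")  # ~10.1 GFLOPS per core
```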

Cut the cords to storage networking

Agencies running stand-alone Fibre Channel-based storage area networks may be able to reduce the amount of cabling snaking through their data centers, thanks to an emerging converged networking standard called Fibre Channel over Ethernet.

Shawn P. McCarthy

Fusion center approach could be effective in other areas

The fusion center approach used in antiterrorism operations could also be applied to other civilian uses, such as bridge and road monitoring and electronic business reporting.