Virtualization has reduced data center footprints, but little has been done to disentangle the duplication and complexity of information and applications sitting in those servers, observes Francis Hsu, an information executive working for the Homeland Security Department.
The GCN Lab offers a few ideas on new products worth checking out during this year's FOSE exposition.
Microsoft said SQL Server 2008 R2, the next version of its database, will be generally available in May.
Intel pushed the outer limits of computing by unveiling an experimental, 48-core processor it described as a single-chip cloud computer.
IBM has extended partnerships with providers of data center infrastructure technology to expand the deployment of the company's Portable Modular Data Center worldwide.
With the advent of cloud computing, rich Internet applications, service-oriented architectures and virtualization, data center operations are becoming more dynamic with fluid boundaries. The shifting form of computing adds layers of complexity that have broad implications for how IT managers secure the components that make up a data center.
Cloud computing still has a lot of uncertainties -- among them a lack of maturity in many of its potential services -- but the path toward this next era of enterprise computing is beginning to take shape.
Technology companies that expect to benefit from cloud computing must creatively adapt their licensing, pricing and revenue models.
Capacity planning is a vital practice data center managers must adopt to achieve power savings and other benefits from virtualization technology, according to IT managers from two federal organizations.
Security, data privacy, the acquisition process, standards and service-level agreements were among the chief issues that feds grappled with.
Cloud computing will fundamentally change the shared services model, the National Business Center's director predicts.
The 2,048-core system, nicknamed Longhorn, will be capable of 20.7 trillion floating-point operations per second and will help researchers keep pace with the explosive rate of data production, TACC officials said.