Complexity could jeopardize data center consolidation

Federal data center consolidation efforts are increasingly complex, which could jeopardize potential cost savings, according to a new survey sponsored by Juniper Networks.

Agencies have to shrink and grow at the same time. As agencies consolidate, federal IT managers have to address the need for increased computing capacity, according to the report, Consolidation Conundrum.

Federal IT managers surveyed estimate their computing needs will increase by 37 percent over the next five years. Moreover, they estimate their data centers will need to be scaled up by 34 percent to meet their growing needs.

The report is based on survey responses from more than 200 federal professionals conducted by MeriTalk and sponsored by Juniper Networks in June 2011. The report highlights the complexity created by consolidation. Sixty percent of the respondents say they are running 20 or more operating systems in their data centers, and 48 percent use 20 or more management software applications.

“Federal data center consolidation will stall when the cost of managing the complexity approaches the savings captured from consolidation,” said Brian Roach, vice president of federal sales with Juniper Networks. “Some data centers may already be approaching the threshold.”

Only 10 percent of those surveyed think federal agencies will meet OMB's mandate of consolidating 800 or more data centers by 2015. Nearly a quarter (23 percent) think the government will have more data centers in 2015 than it does now.

Virtualization of applications and systems also creates challenges for federal IT professionals. Survey respondents note that they virtualize 38 percent of workloads today. However, 70 percent said that increased latency, or delay, for applications and for security services and policies is a problem.

Further, 69 percent report that the unpredictability of system latency is a problem. With virtualization expected to handle 64 percent of workloads in federal data centers by 2015, both latency and unpredictability are challenges IT managers will face, the report states.

To solve these issues, federal IT managers say their agencies must improve network bandwidth and simplicity while maintaining security.

Additionally, agencies should adopt both evolutionary and revolutionary approaches. For instance, multiple remote sites can be consolidated with server virtualization or with solutions that connect two or more data centers. Meanwhile, data center fabric technologies can create much simpler single data center architectures that scale, the report recommends.

And agencies should identify and group applications into data centers by growth requirements. For example, they could move high-growth applications to fabric technologies for scale and run low-growth applications on virtualized infrastructure to control costs, the report states.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.
