After data center consolidation, beware legacy apps

Agency IT managers discuss progress on consolidation, cloud at AFCEA Bethesda event

Federal agencies are making significant progress in consolidating data centers, but they shouldn't overlook the technical challenges associated with legacy systems as they combine centers and move applications to a cloud environment, agency managers told attendees at an AFCEA Bethesda Chapter event on Jan. 13.

Agency managers on a panel titled “Data Center Consolidation – What’s Next for Cloud Computing?” spoke on a range of issues associated with their data center consolidation and cloud computing plans, including application consolidation, licensing, power management and staffing.

The Office of Management and Budget’s 25-point plan for federal IT reform makes cloud computing the default approach to purchasing IT. The plan includes a cloud-first policy that is expected to help reduce the government’s more than 2,000 data centers by 40 percent by 2015.

Agency managers see the benefits of moving to a cloud infrastructure for the rapid provisioning of systems and testing and development of applications, panel members said.

However, managers in general need to address the technical challenges of hosting legacy applications, which may not rely on configurations that are as common and standardized as newer or commodity-type applications, said Dawn Leaf, senior executive for cloud computing at the National Institute of Standards and Technology.

“I would say that we are focusing on the next steps to get the full benefit” of IT consolidation and moving applications to the cloud, Leaf said. NIST is working with agencies and industry to develop greater interoperability, portability and security capabilities for the cloud.


Leaf’s focus on the importance of interoperability and portability in the cloud prompted McClure to ask panel members whether they were consolidating traditional data center hardware and operating systems first and putting off real application consolidation until after that task was complete.

“I think that you have to do some type of application rationalization before a big drag and drop,” said Alfred Rivera, director of the Defense Information Systems Agency’s Computing Services Directorate. For instance, DISA had multiple instances of SAP enterprise resource planning software on different platforms and started to look at ways to rationalize the software at the application layer on a standard architecture.

Energy takes a three-prong approach to the issue of consolidation: housing, hosting and sharing, said William Turnbull, the agency's associate CIO for advanced technology and system integration. Energy has 89 data centers, according to data on federal data centers released last fall.

Housing refers to agency users actually bringing their hardware into Energy’s data center, perhaps because their own facility was not energy-efficient or secure. Politically, this is the easiest way to begin consolidation. In the hosting scenario, Turnbull’s team works with users to get their applications onto a virtual machine and retire the older hardware.

Sharing involves working to put multiple instances of the same application onto a single license.

This approach “is a very successful way of slowly overcoming initial resistance” by users who think they might be giving up control of their hardware, Turnbull said.

The agencies are at various stages of their consolidation and cloud computing efforts.

The State Department is working to reduce 11 domestic data centers to two, one on the East Coast and the other on the West Coast, said Cindy Cassill, director of systems integration with the Office of the CIO. State has a private cloud, offers infrastructure as a service and is moving toward software as a service, hosting applications such as Microsoft SharePoint collaboration software. State hasn’t moved to a public cloud because there are still governance and security issues that need to be resolved, Cassill said.

The Homeland Security Department has consolidated six of the 24 data centers the agency is moving into two state-of-the-art data centers in Mississippi and Virginia, said Margie Graves, deputy chief information officer at DHS. By the end of the year, DHS hopes to consolidate applications from Citizenship and Immigration Services, Customs and Border Protection, Immigration and Customs Enforcement and U.S. VISIT that are hosted in a Justice Department facility.

The agency offers infrastructure-, platform- and software-as-a-service through a private cloud to its various components. But it does have public-facing components, such as the Federal Emergency Management Agency and CIS, whose content can be hosted in a public cloud.

DISA provides virtual machines for development, test and production environments for DOD users through the agency’s internal cloud, the Rapid Access Compute Environment. DISA also provides a collaborative environment for the development and use of open-source and DOD software. The Computing Services Directorate operates 14 data centers across the globe, down from 59 when Rivera first started, he said.

Rivera is part of DOD CIO Teri Takai’s group working to reduce DOD’s 772 data centers to a manageable number by 2015. Rivera will focus on virtualization, storage and creating a standardized operating environment.

Energy has about 15,000 federal employees, but when government contractors who operate DOE’s national laboratories are added, the number grows to 150,000. Within the labs, there has been some resistance to consolidation as machines are brought together in the same locations, Turnbull said.

On the federal level, Energy has been consolidating for a few years now, moving smaller data centers into two primary data centers in Germantown, Md., and Albuquerque, N.M.

Through virtualization, Energy has brought 200 servers down to 100 in those facilities. Energy has also reduced applications as legacy software is retired and through more sharing of applications rather than running multiple instances of the same software.

The part of the cloud that Energy is still working to address is rapid provisioning of systems. “We stand up additional virtual servers within a certain amount of time much faster than a procurement cycle would get you,” he said. But Energy still has a ways to go, Turnbull noted.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.
