Virtualization proves cost effective for some agencies
Capacity planning is a vital component data center managers need to implement in order to achieve power savings and other benefits from virtualization technology, according to IT managers representing two federal organizations
- By Rutrell Yasin
- Oct 08, 2009
Capacity planning is a vital practice data-center managers need to adopt in order to achieve power savings and other benefits from virtualization technology, according to IT managers representing two federal organizations.
Officials at the Energy Department’s Los Alamos National Laboratory have used virtualization technology to address cooling, limited floor space and power consumption as they sought to ramp up capacity in data centers on the sprawling, 36-square-mile campus. Those data centers range from new facilities to some that are 40 years old.
Two years ago, LANL created a virtual environment in which officials decommissioned 100 physical servers and deployed 250 virtual machines on 13 physical host servers, said Anil Karmel, a solutions architect with LANL.
Through virtualization, LANL also decommissioned three data centers, reducing the lab’s data-center footprint by almost 50 percent in some areas, he said. Karmel spoke during a session on Oct. 7 called “Ramping Up for Tomorrow’s Data Center” at the Virtualization, Cloud Computing and Green IT Summit sponsored by 1105 Government Information Group, publisher of Government Computer News.
Because LANL officials applied capacity planning, “we achieved $1.4 million in cost savings to date,” Karmel said.
“It is very difficult to know what you need if you don’t know what you have,” Karmel said. “We implemented tools to measure what we have so we can forecast demand and address the power and cooling issues,” he said.
LANL focused on systems that support the institutional side of the lab. Moving forward, officials plan to address the needs of programmatic scientific users, whose servers and systems sit in those older data centers.
“So we are looking at leveraging the investment we made in virtualization infrastructure and turn that into an infrastructure-as-a-service offering within the laboratory,” he said. That way, the IT staff supporting programmatic scientific users can deliver computing capacity on demand instead of buying a physical asset and installing it in an older data center.
With an infrastructure as a service model, chargeback is important. Life-cycle management is also critical in terms of ensuring that when a machine is provisioned, IT knows who owns it, how long they are going to keep the machine and who will turn it off when the virtual machine is no longer needed, Karmel said.
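The lifecycle-management bookkeeping Karmel describes amounts to keeping a record per virtual machine of who owns it, when it was provisioned, when it should be retired and what it costs. A minimal sketch of that idea follows; the record fields, names and flat monthly chargeback rate are illustrative assumptions, not LANL's actual tooling:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical per-VM lifecycle record -- fields and rates are
# illustrative, not drawn from LANL's actual system.
@dataclass
class VMRecord:
    name: str
    owner: str
    provisioned: date
    decommission_by: date
    monthly_rate: float  # assumed flat chargeback rate per VM

    def charge(self, months: int) -> float:
        """Chargeback owed by the owner for a given number of months."""
        return self.monthly_rate * months

    def expired(self, today: date) -> bool:
        """True if the VM is past its planned decommission date."""
        return today > self.decommission_by

vm = VMRecord("web-01", "jdoe", date(2009, 1, 15), date(2010, 1, 15), 120.0)
print(vm.charge(9))                    # 1080.0
print(vm.expired(date(2009, 10, 8)))   # False
```

With records like this, chargeback and cleanup become mechanical: bill each owner from the rate field, and sweep for expired VMs to answer Karmel's question of who turns a machine off when it is no longer needed.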
The move to put 250 virtual machines onto 13 host servers took about nine months with a staff of three people, Karmel said. LANL initially planned to see a return on investment in two years but got it in nine months. Also, since shutting down three data centers, IT officials have gathered metrics indicating they are saving 873,000 kilowatt-hours of power per year.
“So from a power savings [perspective] virtualization does deliver,” Karmel said.
Congress’ virtualization leap
The House of Representatives has also made the leap into the virtual world, said Richard Zanatta, director of facilities in the Office of the Chief Administrative Officer, U.S. House of Representatives.
The House has reduced its number of enterprise servers from 140 to 18, he said.
“Our biggest initiative now is virtualizing all members of Congress’ servers. That’s 441 servers across the network,” Zanatta said. His team has virtualized about 108 to 110 of those servers so far.
“We are running 40 or 50 servers depending on their capacity or demand on one physical unit. We’ll keep reducing our power footprint as much as we can,” he said.
From a capacity-planning standpoint, Zanatta’s team monitors power consumption. “I can tell when somebody plugs in a new device. I’ll get an e-mail,” he said. This might not be very important from the standpoint of the cost of power, but is crucial from a heating perspective, he said.
“We model the data center and we feed information into a database where we see heat problems [so we] can address cooling” issues, Zanatta said.
Zanatta said it took eleven and a half months to virtualize the enterprise servers, in two separate projects run by two different groups. The return on investment was significant, he said: without virtualization, each member of Congress would have run his or her own small IT organization and would have had to hire a systems administrator and a systems maintainer.
Virtualization offers significant advantages, including server consolidation, improved quality of service and even security, said Daniel Menasce, associate dean and professor of computer science with The Volgenau School of Information Technology and Engineering, George Mason University.
Menasce’s team is experimenting in the GMU lab on ways to dynamically and automatically change the allocation of shares or resources in virtual machines without human intervention.
It is important to note that several virtual machines on a single box share the same physical resources. So how do you apportion CPU shares to the various virtual machines as workloads change in each VM, so that users get the best service levels from their application environments?
Researchers at GMU have developed sophisticated algorithms that perform some of these tasks, he said.
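The core of proportional-share allocation can be shown with a toy calculation; this is a sketch of the general technique, not GMU's algorithms. Each VM's CPU entitlement is its share count divided by the sum of all shares on the host, so raising one VM's shares as its workload grows automatically shrinks every other VM's fraction:

```python
def cpu_fractions(shares: dict[str, int]) -> dict[str, float]:
    """Each VM's CPU entitlement as a fraction of the host's capacity."""
    total = sum(shares.values())
    return {vm: s / total for vm, s in shares.items()}

# Equal shares: each VM is entitled to a third of the CPU.
print(cpu_fractions({"vm1": 1000, "vm2": 1000, "vm3": 1000}))

# Doubling vm1's shares as its workload spikes shifts capacity toward it.
print(cpu_fractions({"vm1": 2000, "vm2": 1000, "vm3": 1000}))  # vm1 -> 0.5
```

An automated controller of the kind Menasce describes would adjust the share counts themselves in response to measured workload, with no human in the loop; the division above is only the final entitlement step.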
Acknowledging that users can achieve significant power savings by moving into the virtual world, Zanatta cautioned against virtual creep. There is the mindset that once you’ve virtualized servers, more and more systems can be virtualized. You could wind up virtualizing systems and services that don’t need to be migrated to a virtual world, he said.
Congress’ IT department is regulated by law as to how many staff they can hire, Zanatta said. “If manpower stays the same and you keep growing virtual servers and more virtual farms there is a payoff that is going to come back and bite you,” he said.
“The complexity is going to continue to grow even with the automated tools,” he said. Those tools will make life easier, but they are not here yet, and someone will have to manage them, he noted.
Rutrell Yasin is senior editor for GCN covering cloud computing.