Virtualization's next steps
Agencies look beyond servers, toward virtualizing desktops, applications and storage
By Rutrell Yasin
Feb 09, 2009
Virtualization is on the verge of extending its reach into agency enterprises.
During the past year, server virtualization, which is the ability to run multiple instances of operating systems concurrently on a single hardware system, gathered momentum in the government sector. But increasingly, federal and state agencies are expanding — or at least thinking about expanding — beyond servers to apply virtualization to applications, desktop PCs and network infrastructures.
The Defense Information Systems Agency, for one, is taking virtualization into the cloud.
DISA recently deployed the Rapid Action Computing Environment, a cloud-computing infrastructure that lets Defense Department personnel quickly provision virtual machines so they can test and develop applications before putting them to real use.
Through a common Web portal, DOD and military service users can purchase a virtual machine. Within 24 hours, it will be set up for them, said Alfred Rivera, director of DISA’s Computing Services Directorate. They’ll pay for it on a monthly basis, and when they are finished, the virtual machine will be decommissioned, he said.
“We’re working to fine-tune some of the security issues but, in essence, it allows our customers to provision a virtual machine with memory, storage and [other] capability so they can download their applications to do test and development before they migrate to a pure production environment,” Rivera said.
Meanwhile, officials at the Defense Health Information Management System (DHIMS) program are moving forward with application virtualization technology that gives clinicians at Camp Lejeune, N.C., remote access to patients’ medical records stored in AHLTA, the military’s electronic health records system.
However, the most common use of virtualization is still for servers, as agencies such as the Environmental Protection Agency look to consolidate their many server rooms.
States also are getting into the act. Arizona’s Department of Environmental Quality has implemented blade servers and virtualization software to reduce computing costs and save energy and space in the agency’s data center. And Pennsylvania has embarked on an initiative to virtualize the state’s entire data center.
Moreover, to meet the rising demand, vendors such as Citrix Systems, Hewlett-Packard, Microsoft and VMware are pushing virtual infrastructures or solutions that encompass the various flavors of virtualization.
Server virtualization yields results
Virtualization can make a single physical resource — such as a server, operating system or storage device — appear to function as multiple resources. Or it can make multiple physical devices appear as a single resource.
Before DISA could move virtual machines to its on-demand computing model, it had to tackle server sprawl, power consumption and, most of all, the underutilization of its infrastructure.
As a service provider to the military services and DOD agencies, DISA works with its customers on application development and any type of virtualization at their own facilities.
“Each customer comes with their own requirements, and we work with them on an individual basis," Rivera said. "And sometimes they come with their own design."
Underutilization of resources is common in data centers. On average, if an Intel-based server that runs Microsoft Windows — known as a Wintel machine — is not virtualized, it is probably running at about 7 percent utilization, and that’s a liberal estimate, Rivera said.
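That 7 percent figure is the whole case for consolidation. A back-of-the-envelope sketch makes the point; the target-utilization and hypervisor-overhead numbers below are illustrative assumptions, not DISA's planning figures:

```python
# Rough consolidation estimate from average server utilization.
# The 7 percent input comes from the article; the 70 percent target
# and 10 percent hypervisor overhead are illustrative assumptions.

def consolidation_ratio(avg_utilization: float,
                        target_utilization: float = 0.70,
                        hypervisor_overhead: float = 0.10) -> int:
    """Rough count of physical servers one virtualized host could absorb."""
    usable = target_utilization * (1 - hypervisor_overhead)
    return int(usable / avg_utilization)

print(consolidation_ratio(0.07))  # about 9 lightly loaded servers per host
```

Under those assumptions, roughly nine such servers could share one virtualized host, which is why utilization in the single digits is hard to defend.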
So DISA must convince its customers that it makes sense to move to virtualized servers on a common platform. To save money, the agency has standardized its infrastructure for Wintel and Linux servers on VMware’s ESXi virtualization platform, Rivera said.
Several years ago, DISA adopted capacity-on-demand contracts with original equipment manufacturers. The agency buys capacity as a utility and only pays for what it uses. Hewlett-Packard provides DISA's Windows and Linux environments, in addition to a virtualization solution on top of that capacity-on-demand contract, he said.
VMware ESXi users can quickly create virtual machines through a menu-driven start-up and automatic configurations. It lets operations managers create virtual machines or import a virtual appliance with direct integration between VMware ESXi and the VMware Virtual Appliance Marketplace.
In the Unix environment, DISA is virtualizing with Sun Microsystems’ Logical Domains, which partitions workloads on a single physical server, and with HP’s Virtual Server Environment for HP-UX systems.
Within the Wintel/Linux infrastructure, DISA has about 4,500 server environments in its data centers. The agency has virtualized about 750 of them.
The biggest challenge the agency faces is ensuring that applications can move seamlessly from the physical to virtual worlds because the agency does not own the application, Rivera said.
Another challenge is more cultural. “I have had to prove to customers that the move to a virtualization environment doesn’t [degrade] performance or efficiencies,” he said. “It is educating a customer base that their application [still] sits in their own domain where it is under their own control.”
Meanwhile, EPA is just starting a multiyear effort to address server sprawl in computer rooms spread across the country.
The agency has one main data center located in Raleigh, N.C., that hosts all the agency’s enterprisewide applications. The facility has about 450 servers with 120 terabytes of storage and a petabyte of data on tape.
However, the agency has 40 computer rooms, small data centers that officials want to drastically cut down to a few, said Myra Galbreath, chief technology officer and director of EPA’s Office of Technology Operations and Planning.
EPA started dabbling with virtualization about four years ago with the implementation of IBM P Series Unix-based servers, she said. That system offered logical partitioning of workloads, and as a result, EPA data center managers were able to put about 36 virtual instances across six physical servers.
Later, EPA adopted 3Par storage technology, which lets agency managers thin-provision storage and deduplicate data. That helped reduce the volume of data that EPA needed to replicate to its disaster recovery site.
Last year, local sites and laboratories started to get into server virtualization, Galbreath said. The agency is trying to assess what has worked best, so officials can set a standard that all local sites can adopt.
EPA also has signed a memorandum of understanding with the Green Grid, a consortium of industry and government organizations that promotes energy-efficient computing, to study how the small computer rooms can reduce power consumption.
“There has been a lot of work on large data centers but not a lot on small computer rooms," Galbreath noted. "And lots of government [agencies] have these small computer rooms."
As EPA tackles server virtualization in small, decentralized computer rooms, Arizona’s Department of Environmental Quality has achieved a more energy-efficient data center with HP blade servers and VMware software.
The agency first tested virtualization of its e-mail system. But based on e-mail volume and system configuration, the information technology team realized that the system wasn’t the best candidate, said Ron Hardin, the department’s CIO. The IT team looked toward mission-critical systems, such as those associated with environmental research, data warehousing and accompanying front-end applications, and the team decided to start there.
After doing an assessment, the team identified about 65 servers that it could virtualize.
The department now runs applications from those machines on seven HP blade servers.
The migration to virtual machines has reduced energy consumption by 30 percent and could lower maintenance and support costs by 40 percent, Hardin estimated.
Agency officials plan to move software-as-a-service and service-oriented architecture applications onto the virtual environment.
A place for desktops
Just as all applications might not be suited for server virtualization, desktop virtualization might not be suited for all environments.
Users typically don't notice when their organization virtualizes a server environment, said Jim Smid, data center practice manager at Apptis Technology Solutions (ATS). Whether an application runs on a virtual machine or a physical server makes no difference to them.
But desktop PC virtualization is a different story.
In such a setup, users have a thin-client monitor, keyboard and small appliance to pull their unique desktop image and applications from a server that resides in a data center. Such an environment makes it easier for administrators to manage and secure the desktop because users basically have a diskless workstation. But mobile employees, for example, who need to work anywhere, anytime might resist such a system.
Yet, during the past year, there has been a push toward desktop virtualization in the government sector, especially in controlled environments, such as training facilities and laboratories, he said.
For example, many military bases across the country have facilities in which they bring in groups of people for training on software or other technologies, he said. IT administrators need to give those machines a baseline configuration that would allow for different environments.
So desktop virtualization is a cost-effective way for administrators to provide customized desktops to users and then refresh them for the next round of people, Smid said.
“There are tools available to very quickly roll out a consistent image to a lot of different folks,” Smid said. That gives them the ability to work independently without needing administrators to deploy physical infrastructure for everyone who receives the training, he said. Military facilities are deploying desktop virtualization solutions from Citrix and VMware for such deployments, he said.
EPA is considering desktop virtualization but only for specialized environments in which employees need access for routine administrative duties and secure information, Galbreath said.
“We’re treading very carefully on that,” she said. “But right now a lot of those folks have two machines” — one for the confidential information, the other for e-mail and Internet connectivity.
“We’re trying to evaluate how we can use desktop virtualization and still ensure the security of keeping the two separate,” she said.
Applications on demand
The Defense Health Information Management System Program Office is pushing forward with application virtualization to give clinicians remote access to patient medical records in the AHLTA electronic records system.
DHIMS' information management/information technology solutions help collect, manage and share health data throughout DOD.
“We have over 100,000 end-user devices and the three services — Army, Navy, Air Force — have their own requirements for those end-user devices that we want to be respectful of,” said Capt. Michael Weiner, DHIMS' chief medical officer.
“That has given us some challenges," he said. "So we have looked at the areas of virtualization.”
A challenge DHIMS faced was giving clinicians access to patient records while they are in field clinics. When loaded on a desktop PC as a client, the AHLTA program is bigger than Microsoft Office, requiring a lot of power, Weiner said.
To give doctors in remote field units access, AHLTA has been put on servers using Citrix’s XenApp application delivery system, which lets them retrieve medical information from anywhere using any device. XenApp manages applications in the data center and delivers them as an on-demand service.
DHIMS has deployed the technology at Camp Lejeune, where Navy medical personnel can serve the Marine Corps, which does not have its own medical system. There are many small clinics at Camp Lejeune where doctors need access to AHLTA, Weiner said.
“We have a provider who is in a tent in a clinic in the field with the Marine unit,” he said. “We’d like them to be able to pull up all the health care information that has been recorded on that patient.”
DHIMS officials plan to extend the capability to Army, Navy and Air Force reserve units.
“We want to ensure that while those servicemen and women are on active duty, clinicians can pull their data up and document it into the DHIMS system,” Weiner said. For example, a reserve unit from Maine might want to see what care was delivered to its members while they were on active duty. “So wouldn’t it be great for their care provider in the reserve unit to see what was documented?”
Weiner said XenApp is being used at military hospitals in Portsmouth, Va., and Camp Lejeune. However, other services use other virtualization systems on the market, such as VMware’s ThinApp and Microsoft App-V, formerly SoftGrid.
Storage: the last frontier?
Planned maintenance is the primary cause of downtime on a computer system. Storage virtualization is one way to mitigate or eliminate those planned downtimes, some experts say.
“We’ve been doing virtualization of storage through IBM’s Storage [SAN] Volume Controller for at least three years,” said Tony Encinias, chief technology officer of Pennsylvania’s Office of Information Technology.
“Virtualized storage reduces the downtime when we have to restore or migrate data offsite,” Encinias said.
Virtualization of storage will be a major future requirement for the Arizona Department of Environmental Quality, Hardin said.
For instance, much of the department’s documentation work — retention schedules and permits — is done on paper. Now the department must digitally store all of that information, he said.
There are many different types of technology that can be used for storage virtualization. The right fit depends on the environment, ATS’ Smid said.
For an agency with a small IT shop that struggles with utilization issues — for example, an IT department that does not use all its storage capacity or whose storage is tapped out — there are relatively inexpensive technologies.
For instance, storage arrays from NetApp and Hitachi can boost the utilization rate and give those agencies the flexibility to move applications from one storage array to another or move one virtual machine to another.
With those solutions, the storage array is used as a front end and everything behind it is virtualized, Smid said. That technique can be effective in a small environment that does not have a lot of transactions that could act as a bottleneck, he said.
For large environments, companies such as EMC work with a storage-area network infrastructure. The company’s approach is to push the virtualization layer into the network so the burden doesn't fall on any single storage device. The SAN handles the virtualization, Smid said.
The solution also is more scalable, although it can be more expensive.
“But if you [have] utilization range in the 20 to 40 percent on your storage arrays and you can deploy something where you can double that, the cost of the virtualization infrastructure very quickly pays for itself,” Smid said.
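Smid’s payback argument is simple arithmetic: doubling array utilization roughly halves the number of arrays an agency needs. A minimal sketch, in which the data volume and array capacity are illustrative assumptions rather than figures from any agency quoted here:

```python
import math

# Illustrative sketch of Smid's point: doubling storage utilization
# roughly halves the number of arrays needed. The 120 TB data volume
# and 50 TB array capacity below are assumed for illustration only.

def arrays_needed(total_data_tb: float, array_capacity_tb: float,
                  utilization: float) -> int:
    """Arrays required when only `utilization` of each array is usable."""
    return math.ceil(total_data_tb / (array_capacity_tb * utilization))

before = arrays_needed(120, 50, 0.30)  # 120 TB at 30 percent utilization
after = arrays_needed(120, 50, 0.60)   # same data, utilization doubled
print(before, after)  # 8 arrays before, 4 after
```

If the virtualization layer costs less than the four arrays it avoids buying, it pays for itself, which is the trade-off Smid describes.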