The year of virtualization

Sure, it lets you do more with less, but it's not easy. Here's what you need to know

Thinking of moving to a virtualized computing environment? This might be a good year to do it.

Agencies increasingly are turning to virtualization technology to consolidate servers and data centers to reduce server sprawl, computing costs and power consumption.

Moreover, agency information technology managers are being forced to meet growing computing demand with less money and staff as more workers retire.

Virtualization can make a single physical resource, such as a server, operating system or storage device, appear to function as multiple resources. Or, it can make multiple physical resources appear as a single resource.

Server virtualization, which is the ability to run multiple instances of operating systems concurrently on a single hardware system, is gathering momentum in the government sector.

To date, the Office of the Secretary of Defense has deployed VMware's virtualization technology to collapse 30 servers into one, using it as a strategy for disaster recovery, storage backup, standardization and security.

The Marine Corps also is implementing VMware products for server consolidation with the goal of reducing its physical infrastructure from 300 data centers worldwide, which include about 12,000 physical servers, to 30 data centers and 100 mobile platforms.

At the state level, Oregon has successfully completed a multiyear network consolidation of 11 agency data centers into a single facility in an effort to improve service, productivity and energy consumption, state officials say.

Another indicator that the technology is a hot ticket is increased activity by vendors such as Cisco Systems, Citrix, Oracle and Sun Microsystems. VMware, a leading provider of virtualization technology, had a breakout year in 2007. And Microsoft is poised to release Windows Server 2008 with built-in virtualization capabilities.

All this activity means more choices and flavors of virtualization from which users can choose. However, virtualization brings its own management, performance and security challenges. If your agency is embarking on a virtualization deployment or considering it, here are a few things that you should know.

1. Not just for servers

Although server virtualization is gaining momentum among defense and civilian agencies for consolidating servers and data centers, it is only one layer of the technology, experts say. It can be applied across applications, desktop PCs, storage systems, network infrastructure and servers.

The concept has been around for years, emerging during the days of the IBM mainframe, when the large machines ran multiple operating systems, or instances of an operating system, in parallel. IBM, Hewlett-Packard and Sun then led the way in putting virtualization capabilities into their Unix-based operating systems.

'What we're seeing now is the computing power available, even down to the laptop level, is great enough that you can run multiple different workloads on a single piece of hardware,' said Roy Campbell, senior technology strategist at Microsoft Federal.

'That single piece of hardware is going to be easier to manage from a space, heating, cooling and cabling perspective,' Campbell said. 'The downside with multiple virtual machines running on a piece of hardware [is that] the need for effective management becomes that much greater.'

In addition to consolidating servers and data centers with its deployment of VMware ACE, the Marine Corps will provide users portability through secure, virtualized desktops that can be carried on a USB thumb drive and deployed on any PC. Mobile combat units would be able to access computing environments on the fly from any location.

The Marines do not have a homogenous operating system environment, said Maj. Carl Brodhun, who works at the corps' Systems Command. 'So that drives us to a solution set requiring virtualization at the platform level to start with, particularly on the server side.'

The Marines intend to expand beyond the virtualization of the data center infrastructure to incorporate virtual appliances. This will give users an operating system, applications and the operational characteristics of those applications within a virtual container. The goal is to rapidly redeploy applications in case of failure and remove applications from a given environment, he said.

'Additionally, the virtual appliance [would] support Systems Command's requirements to position advanced leading-edge technologies, and particularly application functionality, with the warfighters as quickly as possible,' Brodhun said.

In the next phase, the Marine Corps will move toward client-side virtualization.

Virtualization has a critical role in Oregon's consolidation efforts. The state uses virtual local-area networks, but on its wide-area network, it implemented Multiprotocol Label Switching, which is not advertised as a virtualization technology but gives capacity-on-demand to various agencies, said Mark Reyer, the state's data center administrator.

Taking a step further, Oregon's server farm will not have storage systems attached directly. Data center officials are implementing a separate farm attached via Cisco's Virtual Storage Area Network, which provides a way to group storage systems into a logical fabric using the same physical hardware infrastructure.

'We boot from the SANs, not hard drives,' Reyer said. 'So that gives the state maximum flexibility to have capacity on demand on the server farm and rapidly provision without regard to where the data stores are located.'

Desktop, hardware and open-source virtualization are other areas likely to gain attention this year, experts say.

2. Management is critical

Users will need proper monitoring and alerting tools to manage their virtualized environments, experts say.

One of the drawbacks of virtualization is virtual machine sprawl, said Jason Langone, chief VMware architect at the Infinite Group, an integrator and consulting firm whose customers include the Homeland Security Department and U.S. Customs and Border Protection.

'This is often a problem with organizations new to virtualization technology, where it is so easy to add another virtual machine,' Langone said. 'More mature organizations are looking at more sound processes which take into consideration capacity planning and server allocation.'
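The capacity-planning discipline Langone describes can be sketched as a simple pre-provisioning check: before an administrator spins up yet another virtual machine, verify the host still has headroom. The host names, budgets and VM sizes below are hypothetical, chosen only to illustrate the idea.

```python
# Sketch of a capacity-planning check: tally what the host's existing
# guests already consume before agreeing to provision one more VM.
# All names and numbers here are hypothetical.

def can_provision(host, vm_cpus, vm_mem_gb):
    """Return True if the host can absorb one more VM without
    exceeding its CPU and memory budgets."""
    used_cpus = sum(vm["cpus"] for vm in host["vms"])
    used_mem = sum(vm["mem_gb"] for vm in host["vms"])
    return (used_cpus + vm_cpus <= host["cpu_budget"]
            and used_mem + vm_mem_gb <= host["mem_budget"])

host = {
    "name": "esx01",
    "cpu_budget": 16,   # virtual CPUs this host is allowed to hand out
    "mem_budget": 64,   # GB of RAM available for guests
    "vms": [
        {"name": "web1", "cpus": 4, "mem_gb": 16},
        {"name": "db1",  "cpus": 8, "mem_gb": 32},
    ],
}

print(can_provision(host, 2, 8))   # fits within both budgets
print(can_provision(host, 8, 8))   # would exceed the CPU budget
```

Because adding a guest is otherwise a few clicks, a gate like this is often the only thing standing between a new deployment and the sprawl Langone warns about.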

Virtualized environments can bring their own level of complexity, Campbell said. 'Virtualization is not about simplicity. Virtualization is about extending the capabilities of existing systems, doing more with less.'

Management could be an increasing concern because the market has grown from a handful of vendors to about 17 enterprise-level players, said Andrew Cathrow, virtualization product line manager at Red Hat.

An agency might wind up with five or six different flavors of virtualization, Cathrow said, with one set of products on a mainframe, another on PCs, another on servers and so on.

The challenge would be finding a management approach that spans all those areas so IT administrators can holistically manage the entire enterprise.

In one approach, Red Hat two years ago began a project called libvirt, an effort to create an open-source application programming interface that lets disparate tools manage different virtualized environments.

3. Pick your spots

There are good places and bad places for virtualization.

'I do not subscribe to the virtualize-everything model,' Campbell said.

Virtualization is not well suited to high-end workloads or to less mature areas of the industry, where the complexity could be a real killer, he said.

Oregon's Reyer noted that because of federal or state regulatory requirements, some applications might have to be secured on a separate box. Applications that have to adhere to privacy, education, medical or revenue regulations would fall into this group, he said.

'But we've also found that those types of workloads that are processor-intensive are not as amenable to virtualization,' Reyer said. 'It might be that we are new to this and haven't figured out how to tune them yet.'

Like many state government agencies, Oregon has deployed geographic information systems, which are processor-intensive. Throughput and response time for these systems are better on stand-alone machines, he said.

Database engines also might not be suited for virtualization, Reyer said. Some of the database engines are big enough that they require a stand-alone environment. But if they don't have those high-usage, high-throughput requirements, IT managers will virtualize database engines, too.

E-mail is another area that often is not virtualized. Although many experts agree that e-mail servers such as Microsoft Exchange might be too large for virtualization, components such as collaboration features are good candidates.

Agencies considering virtualization should understand what their applications do, along with their requirements and deficiencies, said Stan Tyliszczak, senior director of technology integration for the chief technology office at General Dynamics Information Technology.

'Part of what may happen when you go into a virtualized environment is you may have to go through additional security testing, security certification and accreditation,' Tyliszczak said. 'It might be required or might not be required. It depends on the location of the application and how much it is tied to the hardware.'

4. New licensing model

Virtualization and multicore computing are changing traditional software licensing models. Server software is typically licensed per socket or per CPU. Under this scenario, if an organization runs multiple instances of a single program on a single server, many vendors require users to pay for each copy of the program, whether it is virtual or not.

However, as more organizations adopt virtualization, companies such as IBM, BEA Systems and even Microsoft have been forced to move to licensing based on the number of virtual processors or sockets an application instance uses rather than the number of physical processors or sockets.

Virtualization could become a nightmare for software vendors that charge by CPU, said Bill Vass, president at Sun's federal subsidiary.

Last year, a hot item was 32-way CPUs in servers; this year, the rage is 64-way. Next year, it will be 128-way CPUs, he said.

How do you keep track of that in a virtualized environment? 'You won't even know how many of those CPUs you'll be using at any one time with each application,' Vass said.

For its part, Microsoft offers virtualization products for its operating systems, Virtual PC 2007 and Virtual Server 2005 R2, that are available free with a valid Windows license.

Windows Server 2003 Enterprise Edition allows users to run as many as four virtual guests for each valid copy. Windows Vista Enterprise Edition offers two virtual guests, and Windows Server Data Center Edition permits an unlimited number of virtual guests.

The company is expected to follow a similar strategy with Windows Server 2008, which includes Hyper-V technology, in which the virtualized server is isolated from other servers and from the operating system core. With Windows Server 2008 Standard, users can run their physical server and one virtual server on the physical host without an additional charge.

Vass said Sun moved to a licensing model that charges by the number of employees to whom it delivers services, because an organization's employee count is always easy to determine.

Basically, it is a model for accommodating unlimited users, he said. For instance, if a company employs 5,000 people, Sun would charge for 5,000 users, even though the company might have a Web site with a million users.
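The contrast Vass draws can be put in rough numbers. The sketch below compares a traditional per-CPU bill, which balloons with core counts the customer can barely track in a virtualized farm, against a per-employee bill that follows head count instead of hardware. The prices and counts are invented purely for illustration.

```python
# Hypothetical comparison of the two licensing models.
# Every price and count below is made up for illustration.

def per_cpu_cost(price_per_cpu, cpus_per_server, servers):
    # Traditional model: every physical CPU that might run the
    # software needs a license, whether a guest uses it or not.
    return price_per_cpu * cpus_per_server * servers

def per_employee_cost(price_per_employee, employees):
    # Sun-style model: charge for head count, which is easy to
    # audit, and let actual usage be unlimited.
    return price_per_employee * employees

# Ten 64-way servers make the per-CPU bill scale with the hardware ...
print(per_cpu_cost(1_000, 64, 10))       # 640000
# ... while the per-employee bill tracks the organization instead.
print(per_employee_cost(100, 5_000))     # 500000
```

As the quote notes, the per-CPU figure also assumes you can even count the CPUs each application touches, which virtualization makes uncertain; the per-employee figure has no such ambiguity.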

Whether other companies will adopt this approach remains to be seen. However, licensing should not hamper the implementation of technology, he said.

5. Don't lag on security

Virtualization has inherent security concerns that people might not recognize, experts say.

One critical area involves patch management for software updates and fixing application vulnerabilities across the enterprise.

When you have 200 physical servers, it's easy to track which ones must be patched, Cathrow said.

But in virtual environments, there is no longer a one-to-one ratio; you might have five to 20 virtual machines on a host. Tracking those from a patch management or security compliance point of view becomes a lot trickier, he said.
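The fan-out Cathrow describes is easy to see in a toy inventory: with physical machines, the server list is the patch list, but each virtual host expands into the hypervisor plus every guest operating system it carries. The host and VM names below are hypothetical.

```python
# Sketch of why patch tracking gets trickier under virtualization:
# each host fans out into many guests that must be patched separately.
# All host and VM names are hypothetical.

physical_servers = ["srv%03d" % i for i in range(1, 4)]

virtual_hosts = {
    "host1": ["vm-web1", "vm-web2", "vm-db1"],
    "host2": ["vm-app%d" % i for i in range(1, 6)],
}

def patch_targets(hosts):
    """Each guest is a separate OS instance needing its own patches,
    on top of the hypervisor host itself."""
    targets = []
    for host, guests in hosts.items():
        targets.append(host)      # the hypervisor still needs patching
        targets.extend(guests)    # plus every guest OS on it
    return targets

print(len(physical_servers))              # 3 machines, 3 patch targets
print(len(patch_targets(virtual_hosts)))  # 2 hosts fan out into 10 targets
```

A compliance tool that only walks the hardware inventory would report two systems here and miss the other eight, which is the gap Cathrow is pointing at.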

Some security steps, such as full-disk encryption, that rely directly on hardware simply don't make sense for virtual machines, Campbell said.

'The idea of full-volume encryption on a virtual machine that lives in a file on a physical server doesn't really make sense because the access to that file is controlled beyond the virtual operating system,' he said.

The application of intelligent encryption is a remedy that would let agencies encrypt only the data in storage that is actually in use, said Roy Stephan, director of cybersecurity at Intelligent Decisions, a systems integrator that specializes in the federal government. Unused storage allocations remain unencrypted and open to optimization processes.

Other security concerns surround the ease with which a virtual machine can be moved. If a hard drive is removed from a physical server, somebody is going to know.

On the other hand, someone could take a copy of a virtual machine while the system is still running and move the copy elsewhere for offline attack or another use without as much possibility of detection, Campbell said.

To address that problem, agencies need operational discipline and better tracking of systems and people. Agencies' managers should know what services are running on their systems and their performance profiles.

'Organizations that haven't taken the due diligence to track those things are probably not the best of candidates to start using virtualization,' Campbell said.
