Out of one, many

There are two basic paths but many steps to making the most of server virtualization.

Resources: Virtualization

Here are links to a sampling of major vendors' offerings and information on virtualization. For Web pages with long URLs, we've shortened the route with GCN.com/numbers that you can enter at GCN.com.

Cisco Systems: GCN.com/907

Citrix Systems: GCN.com/908

Red Hat: GCN.com/914

Sun Microsystems: GCN.com/915


Checklist: Evaluating virtualization

1.) Define your goals for virtualization: isolation, consolidation or migration.

2.) Gather server utilization statistics.

3.) Assemble application metrics and service-level runtime requirements.

4.) Estimate virtualization resource needs for new and existing workloads.

5.) Research platform choices to ascertain the most suitable physical hardware.

6.) Determine which guest operating systems you need to support.

7.) Decide whether a 32-bit or 64-bit architecture is most appropriate.

8.) Detail virtualized resource requirements: processors, memory, and dynamic or fixed allocation.

9.) Review topology and scalability requirements: clustering, load balancing.

10.) Determine the cost to support your virtualized environment.

If you're evaluating ways to increase utilization while reducing computing costs, virtualization should be near the top of your list.

Virtualization, the logical abstraction of computing resources, is most commonly implemented at the server level. Server-based virtualization enables a single physical machine to concurrently run multiple operating systems and workloads. (To read GCN's Jan. 7 cover story on the trend toward virtualization, go to GCN.com/930.) Widely implemented in the private sector and supported by plenty of available products, server virtualization helps organizations reduce costs through consolidation, which in turn lowers other expenses, such as power consumption and the need for data center floor space.

The benefits of virtualization go beyond simply reducing the number of servers. A virtualized environment can increase efficiency because all available computing resources (processors and memory) are being used.

Moreover, virtualization enables more robust and reliable topologies.

Many organizations are using virtualization to support disaster recovery capability without the cost associated with maintaining separate physical hot- or cold-standby server configurations.

Organizations are also using virtualization in dynamic-development environments that can be created for the duration of a given project and then collapsed, saving once more on computing resources.

The heart of virtualization technology is the hypervisor: a thin layer of software that usually runs directly on the hardware, where it intercepts some or all of the operating system's calls to the hardware. Typically, the hypervisor virtualizes processor and memory resources while the hosted, or guest, operating systems virtualize other resources, such as networking and storage.

There are two types of hypervisor technology.

Type 1 hypervisors, also called native or bare-metal hypervisors, run directly at the hardware level, while the hosted operating systems run at a second level above the hardware.

A Type 2, or hosted, hypervisor is software that runs in an operating system. In this scenario, the hosted or guest operating system runs at a third level above the hardware.

Virtualization is not new technology. Hypervisors originated in the 1960s in IBM mainframe systems, such as the System/360.

What is new is that virtualization technologies are now available for nearly every operating system and for many other layers of the technology stack. Although available virtualization solutions now number more than 50, their features vary widely.

For example, some solutions, such as those provided by VMware, support automated machine restarting, live migration of virtual machines from one physical server to another, and load balancing of virtual machines across multiple physical servers.

Other virtualization solutions, such as those from IBM, let you virtualize processor resources at a fractional level, while others can virtualize only entire processors. In short, you'll need to document your agency's requirements in detail before evaluating virtualization solutions to determine which is most appropriate.

IBM customers running System i and System p, formerly iSeries and pSeries, have long been able to use the company's Type 1 hypervisor technology to run multiple operating systems, such as i5/OS, AIX and Linux, concurrently.

Other Type 1 hypervisor virtualization solutions include VMware's ESX Server, the open-source Xen project and Sun Microsystems' Logical Domains, which has been available for about two years. More recently, Microsoft entered the Type 1 hypervisor fray with Hyper-V, which is available in beta test form and is expected to be part of Windows Server 2008 when it is released.

Type 2 hypervisor solutions include VMware Workstation, VMware Fusion, the open-source QEMU project, Microsoft Virtual PC and Virtual Server, and Parallels' Parallels Workstation and Parallels Desktop.

Generally speaking, Type 1 solutions are preferable for server virtualization because they achieve greater efficiency through a more direct link with the hardware. Type 2 hypervisors can most often be found on client desktop PCs and workstations or machines where support for a broad range of input/output devices is needed.

Checking your list

Before you can start a detailed evaluation of virtualization products, you'll need a checklist that fully outlines your agency's requirements.

To start, ask yourself what you are trying to achieve.

Aside from server consolidation, virtualization frequently is applied to workload isolation or software migration. You also could choose to isolate workloads in separate virtual machines to increase performance. Likewise, you may want to bring up a new version of a piece of software, such as an application server, in a separate virtual machine to gauge what's required to migrate from an existing version of the same software.

Once you know your goal, the next step is to gather some metrics. Starting at the server level, pull statistics that show how your processors, memory, networking and storage are being utilized. Pull at least three months' worth of server performance information (more if you can) so you can note utilization trends over time.
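The sifting described above can be roughed out in a short script. The sketch below summarizes utilization samples per server and flags lightly loaded machines as consolidation candidates; the sample data, server names and the 20 percent threshold are hypothetical illustrations, not figures from any agency.

```python
# Sketch: summarize CPU-utilization history per server and flag
# consolidation candidates. Sample data and the 20% threshold are
# hypothetical; real input would span three or more months.
from statistics import mean

def summarize(samples):
    """Return (average, peak) utilization for one server's samples."""
    return mean(samples), max(samples)

def consolidation_candidates(servers, avg_threshold=20.0):
    """Servers whose average CPU utilization stays under the threshold."""
    return [name for name, samples in servers.items()
            if mean(samples) < avg_threshold]

if __name__ == "__main__":
    # Hourly CPU % samples (abbreviated for the sketch).
    servers = {
        "web01": [12, 8, 15, 10, 9],
        "db01":  [65, 70, 80, 75, 68],
        "app02": [5, 7, 6, 9, 4],
    }
    for name, samples in servers.items():
        avg, peak = summarize(samples)
        print(f"{name}: avg {avg:.1f}%, peak {peak}%")
    print("Candidates:", consolidation_candidates(servers))
```

The same pattern extends naturally to memory, network and storage counters once they are exported from your monitoring tools.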

Moving above the server layer, gather performance metrics for the applications or services you are considering for virtualization.

Again, three or more months of data will help you see the resources (memory, for instance) each application or service is using and what service-level metrics, such as response time, are needed.

Next, turn your attention to platforms. Your choice of physical hardware is crucial because it defines what you can and can't do with virtualization. For example, if you were to implement Sun's Logical Domains using the company's hardware, you would be able to host Solaris, Linux and FreeBSD operating systems. If you implemented Parallels Workstation using Intel hardware, you could host Solaris, Linux and FreeBSD in addition to Windows, eComStation, MS-DOS and OS/2.

It is also important to define your architecture.

Some virtualization solutions are limited to 32-bit architectures only, and others support both 32-bit and 64-bit architectures. Your choice of architecture likely will come down to how many virtual machines you plan to host along with your expectations for performance and scalability.

Using the metrics you gathered earlier, estimate the number of virtual machines you'll need to meet your goal. If you need to facilitate greater developer productivity by enabling employees to run a few virtual machines on their workstations, you can likely use a Type 2 hypervisor solution together with added processors and memory where hardware configurations warrant it.

Conversely, if you plan to host hundreds of virtual machines, you'll want a Type 1 hypervisor solution and need to invest in higher-end hardware.

Beyond the number of virtual machines, you'll need to estimate the resource requirements for each of them. You can arrive at a fairly accurate number by using the metrics you gathered earlier for existing workloads. For new workloads, you'll need to examine the minimum server requirements needed to run the software you plan to virtualize.
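A rough capacity plan can make that estimate concrete. The sketch below uses a simple first-fit-decreasing placement to count how many physical hosts a set of virtual machine workloads would need; the per-host capacities and per-VM requirements are hypothetical numbers for illustration only.

```python
# Sketch: estimate physical hosts needed for a set of VM workloads
# via first-fit-decreasing placement. All capacities and workload
# figures are hypothetical illustrations.
def hosts_needed(workloads, cpu_capacity, mem_capacity):
    """workloads: list of (cpu_cores, mem_gb) peak requirements."""
    hosts = []  # each host tracked as [cpu_free, mem_free]
    for cpu, mem in sorted(workloads, reverse=True):
        for host in hosts:
            if host[0] >= cpu and host[1] >= mem:
                host[0] -= cpu
                host[1] -= mem
                break
        else:
            # No existing host fits; provision another one.
            hosts.append([cpu_capacity - cpu, mem_capacity - mem])
    return len(hosts)

if __name__ == "__main__":
    vms = [(2, 8), (4, 16), (1, 4), (8, 32), (2, 8)]
    print(hosts_needed(vms, cpu_capacity=16, mem_capacity=64))
```

First-fit-decreasing is only a planning heuristic; leave headroom for hypervisor overhead and utilization peaks before treating its answer as a purchase order.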

A compelling feature of some virtualization solutions is the ability to dynamically allocate resources at runtime based on a schedule or on the performance characteristics of multiple workloads across multiple virtual machines.

This capability enables an agency to dynamically shift processors and memory to closely match requirements, fully utilize all of the available hardware and scale virtual configurations over time to meet growth needs. Be wary of solutions that want you to predefine resource needs in a fixed manner because many of them will not scale.
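A minimal sketch of that dynamic shifting, assuming a scheduler that periodically redistributes a fixed pool of CPU shares in proportion to each virtual machine's measured demand (the VM names, demand figures and 1,000-share pool are all hypothetical):

```python
# Sketch: redistribute a fixed pool of CPU shares across VMs in
# proportion to measured demand, as a dynamic-allocation scheduler
# might each cycle. Demand figures are hypothetical.
def reallocate(demand, total_shares=1000):
    """demand: dict of vm_name -> measured load; returns a share map."""
    total = sum(demand.values())
    if total == 0:
        # Idle cluster: split the pool evenly.
        even = total_shares // len(demand)
        return {vm: even for vm in demand}
    return {vm: round(total_shares * load / total)
            for vm, load in demand.items()}

if __name__ == "__main__":
    print(reallocate({"web": 30, "db": 60, "batch": 10}))
```

Production schedulers add guarantees such as per-VM minimums and caps, but the proportional core is the same idea.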

List the types of topologies you want to support, too. Will all of your virtual machines run standalone? You likely will want to cluster virtual machines to ensure uptime, and you may need support for load balancing and automated failover functions. Define upfront how you plan to implement these capabilities in a virtualized environment.
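To make the failover requirement concrete, here is a toy illustration of the logic: if a clustered virtual machine's heartbeat goes stale, restart its workload on the standby host with the most free capacity. The host names, timeout and capacity numbers are hypothetical and stand in for what a real cluster manager would track.

```python
# Toy illustration of automated failover: restart stale VMs on the
# standby host with the most free memory. Names and numbers are
# hypothetical stand-ins for real cluster-manager state.
def pick_standby(standbys):
    """standbys: dict of host -> free memory (GB); pick the roomiest."""
    return max(standbys, key=standbys.get)

def failover(heartbeats, now, timeout, standbys):
    """Return {vm: target_host} for VMs whose heartbeat is stale."""
    return {vm: pick_standby(standbys)
            for vm, last_seen in heartbeats.items()
            if now - last_seen > timeout}

if __name__ == "__main__":
    beats = {"vm-web": 100, "vm-db": 55}  # last-seen timestamps
    print(failover(beats, now=120, timeout=30,
                   standbys={"hostB": 32, "hostC": 64}))
```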

Finally, cost is an important part of your checklist. Virtualization solutions range from free to thousands of dollars. Be sure to estimate the amount you want to invest in virtualization upfront if you are running the data center within the agency walls.

Expanding directions

Virtualization doesn't end at the server. The same type of technology is being used in shared-storage configurations, such as storage-area networks, to segment back-end databases from one another. Moreover, many solutions support virtualized networking functionality, and virtualization has even begun to make its way into embedded devices.

In the future, you can expect to see virtualization blend with constructs such as grid computing to enable seamless virtualization across distributed groups of machines.

This topology will enable better workload management and boost server uptime even further.

It can also be a foundation for forthcoming business computing trends, such as cloud computing, using the Internet as a platform and software as a service.

Next up: Desktop virtualization

With server virtualization gaining popularity in government information technology shops, are other approaches far behind? Desktop and hardware virtualization, for example, could become more popular this year, said Vic Berger, a technologist at CDW-Government.

Citrix's purchase last year of XenSource, a provider of open-source software, will likely boost desktop virtualization, Berger said. 'Desktop virtualization is going to make a dramatic change in how [organizations] can present that desktop environment to the user,' he said. Desktop virtualization allows servers to host desktop environments that can be tailored to meet the needs of specific users.

VMware is also making a big push into desktop virtualization. The company's ACE, an enterprise tool that can provision PCs inside virtual machines, provides more portability through secure, virtualized desktops that can be carried on a USB thumb drive and deployed on any PC.

'The ability to boot a desktop operating system off a small, external flash drive and be up and operating in 30 seconds' will be a powerful performance booster for organizations, Berger said.

Hardware virtualization, which gives organizations the ability to access and provision a pool of server processing, memory and input/output resources from a single interface, also is gaining momentum.

Egenera, for example, is working with agencies such as the Defense Department's Military Health System and the Census Bureau on server and data center consolidation efforts. A virtualization approach to hardware would represent the next step toward true grid computing, Berger said.

