Internaut | Virtual IT helps make do with less
- By Shawn McCarthy
- Jun 22, 2006
As government data centers strive to cut costs by consolidating both software licenses and hardware platforms, 'virtualization' has become a common buzzword. But what does virtualization really mean, and how can government IT managers take advantage of the technology? More important, what hardware and software commitment is needed?
First, let's settle on a definition of virtualization that's relevant today. Don't confuse the older concept of server, operating system or application virtualization with a relatively recent, derivative version of the word. The older definition (which is still in broad use) describes the creation of a system within a system, such as partitioning hardware into multiple virtual machines or simulating one operating system or application within another system.
The new virtualization, as it's currently used by IT service providers and system planners, means consolidating similar system resources into resource pools that can be ramped up or down as demand dictates. For example, logical volume management tools can combine multiple disks across a network into one large logical disk. Likewise, RAIN (redundant array of independent network) interfaces can combine multiple network links in a way that makes them function, when needed, as a single, higher-bandwidth link.
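The pooling idea is easy to picture in a few lines of code. The sketch below is purely conceptual; the ResourcePool class and its names are invented for illustration and do not correspond to any vendor's product or API:

```python
# Conceptual sketch of resource pooling: individual disks (or network
# links) are aggregated into one logical resource whose capacity is the
# sum of its members. Illustrative only -- not a real storage API.

class ResourcePool:
    def __init__(self, name):
        self.name = name
        self.members = []      # capacities of individual resources

    def add(self, capacity):
        """Add one disk or network link to the pool."""
        self.members.append(capacity)

    @property
    def total(self):
        """The pool presents itself as one large logical resource."""
        return sum(self.members)

# Three 250GB disks behave like a single 750GB logical disk
volume = ResourcePool("logical-disk")
for disk_gb in (250, 250, 250):
    volume.add(disk_gb)
print(volume.total)  # 750
```

The same shape applies to a RAIN configuration: each member is a network link, and the pool's total is the combined bandwidth presented to applications.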
This concept of resource virtualization is focused squarely on applications, which is especially important as government consolidates its applications across multiple departments or agencies. Application virtualization environments now employ a progressive technique called strong-thread migration. It allows a computing process to be directed to a new resource at any time. Upon its arrival at the new machine, the process will continue to execute where it left off.
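The core idea of strong-thread migration, pausing a computation and resuming it elsewhere, can be shown with a toy checkpoint-and-resume sketch. This is strictly conceptual: a running total and a position stand in for thread state, whereas real implementations migrate live execution state between machines:

```python
# Toy checkpoint/resume illustration of migrating a computing process.
# A (total, position) tuple stands in for the thread's state; real
# strong-thread migration moves live state between machines.

def run(data, checkpoint=(0, 0), pause_at=None):
    """Sum `data`, optionally pausing at index `pause_at` and
    returning a checkpoint that another machine can resume from."""
    total, start = checkpoint
    for i in range(start, len(data)):
        if pause_at is not None and i == pause_at:
            return ("paused", (total, i))   # state to hand off
        total += data[i]
    return ("done", total)

# "Machine A" runs part of the job, then the process is redirected
status, state = run([1, 2, 3, 4], pause_at=2)
# "Machine B" continues executing exactly where machine A left off
status, result = run([1, 2, 3, 4], checkpoint=state)
print(result)  # 10
```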
Often, a virtualized application runs in its own virtual environment. This environment represents a distinct layer between the application and the operating system and deals with the registry entries, files and specific components the application requires for execution. A side benefit of this relationship is the elimination of application conflicts and application-specific OS conflicts.
Most likely targets
At government agencies, financial, human resource and supply chain management applications are the most likely targets for virtualization. But doing so requires forethought.
System responsiveness and quality of service are always major concerns when making a transition to virtual environments. These can be overcome with realistic planning, and by developing a map of how and when resources are needed. Vendors recommend studying application usage well in advance to determine its peaks and valleys.
IBM Corp. has long been a pioneer in the concept of virtual machines and applications. Hewlett-Packard Co. and Sun Microsystems have also made significant inroads into the technology. But current adoption is being driven by IT systems integrators, following the lead of agencies that want to reduce their machine overhead and the number of software licenses they require.
One IT manager in Miami-Dade County, Fla., recently told me he's consolidating and virtualizing several applications on a scalable blade system. He tries to maintain processor power about 10 to 20 percent above his current processing needs, allowing for quick scalability when needed.
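The headroom rule that manager describes is simple to express. In the sketch below, the 10 to 20 percent figures come from his practice; the function names and capacity units are invented for illustration:

```python
# Sketch of the blade-headroom rule: keep provisioned processor power
# 10 to 20 percent above current demand, and flag the pool for more
# capacity once utilization erodes the minimum headroom.
# Integer percentages keep the arithmetic exact.

def target_capacity(current_load, headroom_pct=15):
    """Capacity needed to stay headroom_pct above current demand."""
    return current_load * (100 + headroom_pct) // 100

def needs_more_blades(provisioned, current_load, min_headroom_pct=10):
    """True once the pool falls below the minimum headroom."""
    return provisioned < current_load * (100 + min_headroom_pct) // 100

print(target_capacity(100))          # 115 (arbitrary CPU units)
print(needs_more_blades(112, 100))   # False: still 10%+ above demand
print(needs_more_blades(108, 100))   # True: time to add a blade
```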
At this point, the federal enterprise architecture barely touches on the concept of system virtualization. Some mention of virtual components is made in the CIO Council's 'Services and Components Based Architectures' paper issued earlier this year. With luck, there will be more of a virtualization focus in the new IT Infrastructure Line of Business, introduced by the Office of Management and Budget last February.
Former GCN writer Shawn P. McCarthy is senior analyst and program manager for government IT opportunities at IDC of Framingham, Mass. E-mail him at firstname.lastname@example.org.