Dark cloud: Study finds security risks in virtualization

Government IT upgrade projects may soon have a new wrench thrown into the works. According to recent research from Gartner, 60 percent of virtual servers are less secure than the ones they replace.

Gartner expects the situation to persist through the end of 2015, by which time the share of insecure virtual servers is projected to fall to 30 percent.

"Virtualization is not inherently insecure," said Neil MacDonald, Gartner fellow and vice president. "However, most virtualized workloads are being deployed insecurely. The latter is a result of the immaturity of tools and processes and the limited training of staff, resellers and consultants."

Numerous state, local and federal agencies have moved or are moving to virtual servers, including the state of California and the Energy Department. While Gartner estimated that only 18 percent of enterprise data center workloads had been virtualized at the end of 2009, that number is expected to grow to more than 50 percent by the close of 2012.

One of the major causes of this issue is a lack of involvement of the IT security team in the architecture and planning stages of development, Gartner said. About 40 percent of the surveyed organizations had not brought security professionals into the projects.


Another risk is that a compromise of the virtualization layer could expose all hosted workloads, and hackers are already targeting that layer, Gartner said. The firm recommends keeping the layer "as thin as possible" while hardening its configuration against unauthorized changes.

Organizations should not rely on host-based security controls, the report states.

Other risks the report identifies include:

  • A lack of visibility and controls on internal virtual networks, whose traffic cannot be seen by network-based security devices such as intrusion prevention systems.

  • Consolidation of workloads with different trust levels on the same physical server without adequate separation.

  • Inadequate administrative access controls and administrative tools for the hypervisor/virtual machine manager layer.

  • A potential loss of separation of duties for network and security controls, which could inadvertently allow users access to data beyond their normal privilege levels.

To address these risks, Gartner recommended treating the virtual network like a physical one, with the same monitoring, the same separation of workloads and the same team managing both. Organizations should also isolate virtual desktop workloads from the rest of the physical data center and restrict access to the virtualization layer.

About the Author

Kathleen Hickey is a freelance writer for GCN.


Reader Comments

Fri, May 7, 2010 Ryan Denver

Getting all groups involved early and often is crucial to closing gaps (security, process) in any project. Virtualization is no different; in fact, a large-scale virtualization project will expose those flaws pretty early, because it naturally crosses so many groups' traditional technical boundaries. Network, storage, Windows/Unix and security all need to be on the same page.

Fri, Mar 19, 2010 M Reston, VA

This issue more than any other shows how unqualified the Fed's current IT leadership is. Their heads are mired in 1989. If you can't deal with the security challenges of the Internet, go work at a help desk somewhere. Security solutions have increasingly come at the direct expense of potential productivity. The IT emperors have no clothes!

Fri, Mar 19, 2010 JDBailey USA

RFC/Vet the information below. Associate: Infrastructure:Internet:Virtualization :: Services:WWWeb:Cloud. Cloud services present much the same problems as WWWeb services, and virtualized infrastructure has much the same problems as the Internet. Cloud services + cyber attack are not virtualized infrastructure + physical disaster. Security, quality and reliability for servers (infrastructure) have always been mostly physical; security, quality and reliability for Web/apps (services) have always been mostly a matter of cyber-security. I am not a certified whatever; so, IMHO, virtualization done well should make infrastructure easier, which is best. Responsibility and cost for cyber-security (cloud), I suspect, will remain about the same.
