Don't look down: The path to cloud computing is still missing a few steps

Agencies navigate issues of interoperability, data migrations, security and standards

The federal government is moving to the cloud. There’s no doubt about that.

Momentum for cloud computing has been building during the past year, after the new administration trumpeted the approach as a way to derive greater efficiency and cost savings from information technology investments.

At the behest of federal Chief Information Officer Vivek Kundra, the General Services Administration became the center of gravity for cloud computing at civilian agencies, with the launch of a cloud storefront that offers business, productivity and social media applications in addition to cloud IT services.

High-profile pilot programs generated more buzz about cloud computing, including the Defense Information Systems Agency’s Rapid Access Computing Environment and NASA Ames Research Center’s Nebula, a shared platform and source repository for NASA developers that also can facilitate collaboration with scientists outside the agency.

Related stories

NASA explores the cloud with Nebula

Cloud computing has appeal for Web applications

But the journey to cloud computing infrastructures will take a few more years to unfold, federal CIOs and industry experts say.

Issues of data portability among different cloud services, migration of existing data, security and the definition of standards for all of those areas are the missing rungs on the ladder to the clouds.

“Cloud computing is not a technology that can just be turned on overnight,” said Peter Tseronis, deputy associate CIO of the Energy Department and chairman of the Federal Cloud Computing Advisory Council.

“We spent a lot of last year defining what the cloud is, what are the various delivery models, deployments and characteristics,” Tseronis said. “We still continue to need to do that.”

The government defines cloud computing as an on-demand model for network access, allowing users to tap into a shared pool of configurable computing resources, such as applications, networks, servers, storage and services, that can be rapidly provisioned and released with minimal management effort or service-provider interaction.

The three delivery models are:

  • Software as a service (SaaS), which provides business applications running on a cloud infrastructure and accessible on a client device via a Web browser.
  • Platform as a service (PaaS), which is the deployment via the cloud of user-developed applications, such as databases or management systems.
  • Infrastructure as a service (IaaS), which is the provisioning of computing resources for users on an as-needed basis.
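The defining trait of all three models is the shared, rapidly provisioned resource pool. As a rough illustration only, the toy sketch below models that idea in Python; the class and method names are invented for this example and come from no federal specification.

```python
# Illustrative sketch only: a toy model of the shared, on-demand resource
# pool in the government's cloud definition. Names here are hypothetical.

class ResourcePool:
    """A shared pool of computing resources that tenants can
    provision and release on demand with minimal interaction."""

    def __init__(self, capacity):
        self.capacity = capacity      # total units in the shared pool
        self.allocations = {}         # tenant -> units currently held

    def provision(self, tenant, units):
        """Rapidly allocate resources to a tenant, if the pool has room."""
        in_use = sum(self.allocations.values())
        if in_use + units > self.capacity:
            raise RuntimeError("pool exhausted")
        self.allocations[tenant] = self.allocations.get(tenant, 0) + units
        return units

    def release(self, tenant):
        """Return a tenant's resources to the shared pool."""
        return self.allocations.pop(tenant, 0)

pool = ResourcePool(capacity=100)
pool.provision("agency-a", 40)
pool.provision("agency-b", 30)
freed = pool.release("agency-a")   # agency-a's 40 units return to the pool
```

The point of the sketch is the elasticity: capacity released by one tenant is immediately available to the next, which is what distinguishes the cloud model from dedicated hardware.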

The Federal Cloud Computing Advisory Council provided a governance structure last year to disseminate information about cloud computing and its concepts, benefits and risks. The council will continue to raise awareness about the governance structure among agencies, Tseronis said.

But some agencies remain confused about the cloud, Tseronis said.

Agency managers are wondering about security and data privacy risks associated with the cloud. Are there procurement barriers? What is better: a public or private cloud? How do you set up a service-level agreement? What are the data interoperability and portability issues?

Security Struggles

The Bureau of Alcohol, Tobacco, Firearms and Explosives hasn’t launched a specific cloud project, but officials have been evaluating the benefits and risks for more than a year because a move to the cloud seems like a natural fit. “We are already fairly outsourced in terms of our IT infrastructure,” said Rick Holgate, the bureau's CIO.

ATF has dedicated hardware and physical space in two data centers: one government-owned but contractor-operated, the other both owned and operated by a contractor.

However, security is a major concern. Most agencies have concerns about data separation because they want to prevent a commingling of data with tenants in other environments. And they need access restrictions on data to make sure cloud hosting providers or other tenants don’t inadvertently or intentionally get access to sensitive data.

“We are all struggling in the federal space with the right security model around the truer cloud provision capability,” Holgate said.

Despite some progress toward resolving those issues, more work is necessary to hash out security requirements that federal agencies need to follow to ensure that sensitive but unclassified and classified information is secure, Holgate said.

First, cloud providers need to understand government security requirements and deliver services that satisfy those requirements. Microsoft recently created a federal version of its Business Productivity Online Services for the cloud, which is one example of how vendors could help address security requirements, he said.

On the federal side, “we need to probably do a better job of articulating what those requirements are from a security perspective,” Holgate said.

The federal government still has a fragmented approach to security, he said. “We don’t have a single, unified — to my knowledge — federal voice that everyone has agreed to and signed up to as the authoritative version of what the federal government considers sufficiently secure in a cloud-type environment,” he said.

GSA and the National Institute of Standards and Technology have been addressing security requirements, and the Justice Department tackled the problem at a department level, Holgate said.

“But there isn’t a clearly articulated checklist [that says] if you meet these 10 or 12 requirements, we’ll consider you a secure cloud provisioning model,” Holgate said.

Microsoft is trying to address some of those concerns by working with GSA to ensure that its cloud hosting model complies with the Federal Information Security Management Act.

“It is just a question of whether what [Microsoft] is putting forward as a federal version of the cloud is sufficient to meet the federal government’s requirements,” Holgate said. “And in the case of doing the FISMA compliance with GSA, can the rest of us accept GSA’s accreditation of that model?”

Standard Controls

The Federal Cloud Computing Security Working Group, an interagency initiative, is working to develop the Government-Wide Authorization Program (GAP), which will establish a standard set of security controls and a common certification and accreditation program that will validate cloud computing providers.

“The lack of a governmentwide authorization program is hindering federal adoption of cloud computing,” said working group chairman Peter Mell, who spoke recently at the Institute for Defense and Government Advancement’s Cloud Computing for DOD and Government conference in Alexandria, Va.

Cloud vendors need to implement multiple agency policies, which can translate into duplicative risk management processes and lead to inconsistent application of federal security requirements.

As a result, Kundra wants the Cloud Security Working Group to move quickly on this program, Mell said. Work began last year on a governmentwide risk management process that includes security authorization. The Cloud Executive Steering Committee approved a draft design of the document in January.

The Cloud Security Working Group developed cloud computing security requirements that use NIST Special Publication 800-53, released in February. NIST continues to work with cloud vendors to further refine a risk management framework for cloud computing.

Mell said the working group is trying to find an agency to run GAP. The goal is the rapid development of interagency, vetted security requirements for cloud systems that can be used for risk management, including assessment, authorization and continuous monitoring.

Those cloud vendors can be software-, infrastructure- or platform-as-a-service providers or even solutions that are offered governmentwide, such as NASA’s Nebula, Mell said. Eventually, cloud computing providers will implement the federal security requirements for the cloud and have a third-party assessment of that implementation.

Vendors can then deliver an authorization package to GAP, get interagency validation and receive a logo that states that they have been authorized.

GAP is not meant to take away agencies’ authority, Mell said. Agency managers do not have to use the validated cloud vendors. Agency officials are responsible for determining whether the validation meets their needs. GAP is intended to be a tool that can help agencies speed the acquisition of cloud solutions, Mell said.

Managing Identities

At the user level, there are challenges associated with access control and identity management, said Doug Bourgeois, director of the Interior Department’s National Business Center, which provides cloud computing services to agencies. NBC offers federal agencies access to government financial management systems, human resources packages, acquisition automation and other enterprise applications.

“We implemented as part of our cloud a federated identity management module,” Bourgeois said. “Things can be solved, but there are no clearly defined federal standards for these things.”

Organizations must extend their existing identity, access management, audit and monitoring strategies into the cloud. However, the problem is that existing enterprise systems might not easily integrate with the cloud, said Adam Vincent, chief technology officer for the public sector at Layer 7 Technologies.

With the company’s SecureSpan Networking Gateway, internal enterprise resources are accessible from the cloud and, in turn, a virtual gateway in the cloud can secure applications and connect to enterprise systems.
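The core of any such federation scheme is a claim about a user that one party signs and the other verifies. The sketch below is a minimal, invented illustration of that handshake; real deployments use standards such as SAML, and the token format, key handling and field names here are assumptions for illustration only.

```python
# Illustrative sketch only: a minimal signed identity assertion, showing the
# kind of token an enterprise identity system might hand to a cloud service.
# Real federated identity uses standards such as SAML; this format is invented.

import base64
import hashlib
import hmac
import json

SHARED_KEY = b"enterprise-cloud-shared-secret"   # provisioned out of band

def issue_assertion(user, roles):
    """Enterprise side: sign a claim about the user."""
    claim = json.dumps({"user": user, "roles": roles}, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, claim, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claim).decode() + "." + sig

def verify_assertion(token):
    """Cloud side: accept the claim only if the signature checks out."""
    encoded_claim, sig = token.rsplit(".", 1)
    claim = base64.urlsafe_b64decode(encoded_claim)
    expected = hmac.new(SHARED_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid assertion")
    return json.loads(claim)

token = issue_assertion("jdoe", ["analyst"])
identity = verify_assertion(token)
```

The hard part in practice is not the cryptography but exactly what Bourgeois describes: agreeing across agencies and providers on the claim format, the trust roots and the audit trail.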

In addition to security, data interoperability is another gap that has to be closed.

The cloud is still missing data interoperability, said Venkatapathi Puvvada, vice president and managing partner of Unisys’ Federal Horizontal Services.

An agency cannot take data from a public cloud provider, such as Amazon or Google, move it into an infrastructure-as-a-service platform that a private cloud provider builds for the agency, and then exchange that data with yet another type of cloud provider, Puvvada said.

That type of data transfer is difficult because there are no overarching standards for operating in a hybrid environment, he said.
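In the absence of those standards, one pragmatic hedge agencies can take is keeping an export path into a provider-neutral format that any target cloud can re-ingest. The sketch below uses plain JSON; the field names and version scheme are invented for illustration, not drawn from any standard.

```python
# Illustrative sketch only: round-tripping records through a provider-neutral
# interchange format (plain JSON). Field names here are hypothetical.

import json

def export_records(records):
    """Serialize records into a neutral format on the source cloud."""
    return json.dumps({"format_version": 1, "records": records}, sort_keys=True)

def import_records(payload):
    """Re-ingest the neutral format on the destination cloud."""
    doc = json.loads(payload)
    if doc.get("format_version") != 1:
        raise ValueError("unsupported interchange version")
    return doc["records"]

original = [{"id": 1, "title": "case file"}, {"id": 2, "title": "audit log"}]
round_tripped = import_records(export_records(original))
```

A neutral export format does not solve the custody and ownership questions Puvvada raises, but it keeps the data from being locked into any one provider's representation while standards mature.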

Who has custody and control of the data if your agency uses multiple providers? The cloud computing industry hasn’t resolved those ownership concerns, and it doesn't know what standard will emerge to handle information exchanges, Puvvada said.

But that doesn’t mean each agency should have its own secured private cloud, Tseronis said.

“You want to have this portability or interoperability based on standards,” he said. This is why the Federal Cloud Computing Initiative has established working groups for security, standards, communications and outreach, and operating excellence, he said.

Those groups are working with standards developers, such as the Institute of Electrical and Electronics Engineers and NIST, that deal with Internet standards. They are focused not only on standards for cloud computing but also interoperability with standards for cybersecurity, Homeland Security Presidential Directive 12 and IPv6.

“The last thing we want are separate clouds that are built on infrastructures that do not talk to one another or [have] languages or protocols that won’t speak” to one another, Tseronis added.

Standards generally trail the technology, which is not necessarily a bad thing, said Stan Freck, director of cloud computing at Microsoft’s U.S. Public Sector. By waiting for standards, companies tend to make incremental steps rather than revolutionary change, he added.

Nonetheless, vendors are trying to apply to cloud services the existing standards they use when designing on-premises solutions, Freck said.

Government entities and commercial suppliers need to come together and evolve those standards to give them a cloud-specific flavor, he said.

Many organizations are focusing efforts on interoperability standards to forge data exchange among various types of clouds.

For instance, the Distributed Management Task Force, a consortium of IT companies focused on systems management issues, is working on the Open Cloud Standards Incubator to facilitate interoperability between public and private clouds. Members include companies such as Advanced Micro Devices, CA, Citrix Systems, Hewlett-Packard, IBM, RackSpace and VMware.

The aim is to develop cloud resource management protocols, packaging formats and security mechanisms that facilitate interoperability.

Other organizations working on interoperability standards include the Network Centric Operations Consortium, Object Management Group, Open Grid Forum, Organization for the Advancement of Structured Information Standards, and Storage Networking Industry Association.

In addition, the cloud computing industry needs standards for securely exchanging information among clouds.

“What if I have something running in Microsoft Azure, another capability in Amazon Elastic Compute and something else in Google’s environment, and I’ve mashed them together somehow,” said Bob Gourley, chief technology officer at Crucial Point. “How can I as a government user have confidence that the data exchange between them is going to be secure?”

That standards issue depends on getting providers to agree on how to exchange data in the right format, Gourley said.

Migration Mysteries

Agency officials cannot afford to ignore the movement to the cloud, especially because the Obama administration has mandated that agencies look for greater efficiencies using cloud computing.

As a result, agencies should start to develop a cloud strategy and identify candidates for pilot projects, experts say. Tasks that are well suited for the cloud include software development, e-mail, collaboration and social media software, content management, and Web portal environments.

“One of the biggest things I think about is moving legacy applications to the cloud,” said John Shea, director of enterprise services and integration at the Office of the Deputy Assistant Secretary of Defense.

“I believe we don’t have one thing that is legacy applications; we have maybe five or six categories,” and each group requires a transition strategy, he said at the recent IDGA cloud computing conference. He said his group might be able to apply software wizards to facilitate the transition for some applications.

ATF and other federal organizations will run into challenges when they try to extend mission-critical systems to the cloud. Most of those applications are not built with a common architecture. They use diverse technologies and don’t lend themselves well to rehosting.

“When you look at getting those legacy systems rearchitected and re-engineered to be amenable to deployment in a cloud type of environment, that is more of a substantial technical issue,” Holgate said.

“We haven’t necessarily made the investment on an ongoing basis of keeping our mission support, legacy systems current with technology so they can be easily moved into that kind of hosting environment,” he said.

“There are a lot of unknowns in how you move legacy systems into a cloud hosting environment,” Holgate said. ATF would need to examine how the agency’s mission application architecture would be structured in a cloud environment, he said.

Many other agencies are probably in a similar situation, experts say.

The cloud has many potential uses, NBC’s Bourgeois said. It’s fairly easy to start a new development project in the cloud. It’s fairly simple to use an existing cloud-based service for collaboration and Web e-mail. But it’s trickier to use cloud services for storage and disaster recovery or migrate static Web sites to the cloud. Although there are no standards, third-party plug-ins are available to aid with the migration.

However, migration from existing production resources to the cloud is more complex, and it will take time for the technology to catch up, Bourgeois said.

“I’m hearing that there are products that claim they will encapsulate data and move it to the cloud using the [DMTF] Open Virtualization Format,” Bourgeois said. However, for the most part, standards for data portability between the data center and the cloud are not here yet, he added.
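OVF packages a virtual machine and its metadata in an XML envelope so it can move between virtualization platforms. As a rough illustration only, the sketch below builds a skeletal OVF-style envelope with the Python standard library; a real OVF package carries far more (disk references, hardware sections, checksums), and while the namespace and element names follow OVF 1.x, this fragment is not a valid package.

```python
# Illustrative sketch only: a skeletal OVF-style envelope, to show the flavor
# of the DMTF Open Virtualization Format. Not a complete or valid package.

import xml.etree.ElementTree as ET

OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"

envelope = ET.Element(f"{{{OVF_NS}}}Envelope")
system = ET.SubElement(envelope, f"{{{OVF_NS}}}VirtualSystem",
                       {f"{{{OVF_NS}}}id": "legacy-app-01"})
info = ET.SubElement(system, f"{{{OVF_NS}}}Info")
info.text = "A virtual machine packaged for migration to the cloud"

xml_bytes = ET.tostring(envelope)   # serialized envelope, ready to inspect
```

The appeal of a format like this is exactly what Bourgeois describes: the machine's description travels with it, so the destination cloud does not need to know anything about the source environment.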

That's why federal agencies should consider private cloud models first, said Brooke Guthrie, manager of product and business management at CDW Hosting and Managed Services.

Private cloud models enable agencies to maintain the security of sensitive data. Within a private cloud, federal agencies should decide which systems to migrate based on their performance requirements.

Agencies also should consider which systems are mission-critical and test to make sure those systems will perform well in a cloud environment. They need to begin application migrations with smaller data components rather than larger, image-heavy data pieces, Guthrie said.

To rearchitect applications for the cloud, CDW-G recommends that agencies consider middleware. Such tools can help agencies tune data storage and retrieval, which can improve system performance in the cloud, she said.

Finally, agencies must prioritize project management. Determining a cloud project’s structure and testing it will give agencies a template they can reuse to repeat a successful cloud model in future projects, Guthrie said.

For the most part, the push this year in the federal sector will be to launch more pilot tests. CIOs and IT managers will also need to start applying cloud computing to their agencies’ missions, Tseronis said.

Agency officials have already been informed that they need to plan for the cloud in their fiscal year planning for 2013 and 2014 as it relates to their IT investments.

“It is not an all-or-nothing paradigm, but it needs to be considered,” Tseronis said.

