Glenn Weinstein


Government IT to the cloud: private-sector lessons

Businesses' shift to public clouds can offer a path for agencies to follow

With his December 2010 “25-Point Implementation Plan to Reform Federal Information Technology Management,” federal CIO Vivek Kundra sent shock waves through the IT world by announcing a “cloud-first” policy for the federal government.

Writing in clear, non-bureaucratic prose, Kundra showed that a new era is dawning in federal government IT.

Point No. 3 of the plan, the cloud-first policy, throws down the gauntlet immediately by comparing commercial and government IT in their ability to build applications that scale quickly. Demand for the 2009 “Cash for Clunkers” program quickly exceeded expectations, leading to unplanned outages and service disruptions within three days of the program’s launch. By contrast, Kundra describes how an unnamed “Web-based multimedia production company” was able to scale a video-sharing service from 50 to 4,000 virtual machines within the same period — three days — to meet demand that was tenfold what the company had anticipated.

Kundra’s implicit taunt — “the private sector can do it, so why can’t we?” — mirrors a typical sales approach used by commercial solutions providers, comparing consumer and enterprise services. The consumer Web, illustrated by the success of Amazon, Google, Facebook, Twitter and others, has raised users’ expectations, and those same users are now increasingly expecting a similar level of integration and performance in the computing services provided to them at work. In other words, “If our kids are enjoying such a great computing experience, why shouldn’t our customers and our constituents?”


Private-sector IT shops face many of the same cloud adoption barriers that government agencies face. Large companies are entrusted with securing data for millions of customers and have invested years in integrating systems internally and with commercial partners. System availability and performance are paramount; even a few minutes of downtime could translate to significant lost revenue or worse. In the short term, the risks of a dramatic shift in IT architecture may not seem worth the long-term benefits.

However, in the four years that Appirio has provided products and consulting services to large organizations moving to the public cloud, we have witnessed a significant shift in priorities. We encounter far less resistance from CIOs and the IT rank and file to the notion of migrating to cloud-based systems. Individual IT practitioners have moved from seeing cloud technology as a threat to viewing it as a necessity for career progression.

Public cloud services such as NetSuite, Google Apps and Amazon Web Services are becoming ubiquitous in the Fortune 1,000 landscape. They offer a proven security model, scalability, and a mature set of technologies that the corporate world has learned it can count on for performance and uptime. Kundra’s call for the National Institute of Standards and Technology to work in 2011 to develop standards for security, interoperability and portability will allow such companies to certify their readiness to serve a greater number of business and government scenarios.

Kundra’s mandate is aggressive. His requirement for each agency CIO to designate at least three “must-move” services to migrate to the cloud mirrors the approach we’ve seen successful corporate CIOs take.

The CIO often is among the first to recognize the long-term benefits of shifting to the public cloud: economics, flexibility and speed to market. But departmental leaders often work within bureaucratic systems that, ironically, discourage optimizing for long-term benefits when doing so involves short-term risk. By establishing a bold mandate, a CIO can give subordinates air cover in signing up for transition plans that require moving significant data and processes away from the data centers they implement, manage and maintain.

Moving to the cloud may make for some tough early conversations but will ultimately enable government IT organizations to address problems they have been itching to solve but that have lain dormant for lack of resources.

Another encouraging aspect of the federal plan is its emphasis on “commercial cloud technologies,” relegating “private” and “regional” clouds to situations where a commercial cloud is not feasible. This aligns with a commonly held vision that so-called private cloud computing, an evolution of virtualization technology within a company’s physical or virtual data centers, should be viewed as a steppingstone toward deployment to public (commercial) clouds.

Ultimately, the IT consumer, whether a company or a government agency, gets the full benefit of pay-per-use pricing, nearly unlimited scalability up and down, and a shift from capital to operational expenditures by relying on fewer, larger pools of computing power shared with very large numbers of other companies or agencies.

Cities such as Washington, D.C., and Los Angeles are proving the viability of the public cloud computing model for essential IT services such as e-mail and collaboration tools, and helping the marketplace identify and prioritize must-have features that will allow such solutions to be deployed even more widely. Between marketplace pressure and forthcoming standards from NIST, vendors will have a clear road map for delivering cloud-based solutions that can apply to the broadest possible set of government agencies.

Some have criticized Kundra’s plan for its lack of specificity. For example, it draws little distinction among the types of cloud solutions, categorized broadly as software as a service, platform as a service and infrastructure as a service. But flexibility is needed in any plan that is meant to provide top-level direction and apply across a disparate set of usage scenarios. Kundra has properly allowed each agency to make choices appropriate to its unique circumstances while leaving no room for misunderstanding the overall mandate toward adoption of public cloud computing.

2011 should be a watershed year for public cloud computing in the government sector, particularly the federal government, as agencies attempt to comply with the CIO’s plan. The plan will place significant new demands, in terms of both scalability and breadth of requirements, on existing cloud vendors and perhaps spawn the creation of new ones. Just as the Fortune 1,000 learned valuable lessons and found opportunities in the consumer market, government agencies should consider the experience of the commercial sector in crafting their plans.


About the Author

Glenn Weinstein is chief technology officer for Appirio, a cloud computing provider.

Reader Comments

Thu, Jan 20, 2011 Editor

Editor's note: An apparent system glitch initially prevented Mr. Weinstein's byline from appearing at the top of the article, but it has been restored.

Thu, Jan 20, 2011 Cautiously Optimistic Washington DC

I noticed at the top that the author's name was not listed. I figured it was probably a cloud vendor trying to sell services to the US Government. As I read more I guessed it was probably some high-level manager from Appirio, and at the very end of the article I find “Glenn Weinstein is chief technology officer for Appirio, a cloud computing provider.” Imagine that.

Anyway, Weinstein has some valid points, but he fails to see the security, retraining, availability and customer implications. We have already seen incidents in the cloud, and ultimately the ‘FAIL’ news falls on the corporate face of the company that put its information (e.g., privacy info) in the cloud. Should the Government put unencrypted proprietary planning and acquisition information in a public email or IM cloud? Hint: the answer is a three-letter word. The private company hosting the Government's cloud has valuable information in its cloud that could be used to improve its chance of winning more Government contracts.

Ultimately, moving to the cloud is a risk-based decision. Most legacy systems are running fairly efficiently and don't need to move to the cloud - not broke, don't need fixed; let them die a graceful death. Those annoying small, low-risk, non-confidential systems that can't seem to be shut down (so-called 'craplications') should first be virtualized and then moved to the cloud. Yet at the same time the Government must have requirements for security, performance, SLAs, etc., while avoiding vendor lock-in. Moving some non-core business functions to the cloud is the next 'logical' shift after virtualization and enterprise architecture. However, if moving to the cloud means finding a solution that meets the lowest common denominator (i.e., best fit), services will continue to degrade. If I sound like I'm jaded, OK; but I have been around the block before. Remember EA, TQM, Cheaper Better Faster, reengineering, Y2K, outsourcing, too big to fail, insourcing, and now it's the cloud. I just want it to work! Remember your customers; they just want it to work without consequences.

Hello Open Stack; we are planning to come to the public/private cloud in some planned way. However, when we start to get there we will likely be pulled in the direction of the Next Big BuzzWord. Hey, may I start with suggesting NBBW?

BTW, most smart managers in IT don't trust the electric grid, hence we have enterprise-class UPSs and generators for backup; the communication cloud (Internet plus private diverse circuits, cell phones and Plain Old Telephone); water supply (water cooler and bottled water), etc. We learned lessons from the past. I want to minimize the lessons in my career in IT.

