Containers wait on sidelines for expected government uptake
- By Carolyn Duffy Marsan
- Jan 14, 2015
While still scarce in government data centers, containerization is a cutting-edge technology that promises eventually to deliver dramatic improvements in the efficiency of cloud-based applications, according to its public-sector developers.
Containerization is a standards-based approach to packaging application code that allows software developers to re-use code and create applications that are easily ported to different operating systems and devices.
The approach could make infrastructure-as-a-service more efficient because developers can put more containers on a physical server than is currently possible when running virtual machines.
Containerization is early in its development and is just beginning to appear in government pilot projects. Early adopters include developers of consumer apps such as Yelp, Spotify and eBay. The approach is also being deployed by infrastructure vendors such as Rackspace, which is using the technique for its email service.
Experts believe containers eventually will be deployed in public and private clouds used by public-sector organizations.
“The challenge with containerization is that it’s brand new. It’s not ready for enterprise, mission-critical apps,” said Susie Adams, chief technology officer of Microsoft Federal. “Why Microsoft has invested in it is that it looks really promising. If you can pack more virtual images into a physical server, you can better use your resources. But there is no enterprise-grade management system available yet.”
The leading containerization company is Docker, a two-year-old startup that has raised $66 million in venture financing. Docker has gained momentum since it began offering its platform for free under an open-source license, attracting support from Microsoft, Amazon Web Services, Google, VMware, IBM and Red Hat.
How it works
Docker consists of two parts: Docker Engine, a lightweight, portable runtime tool, and Docker Hub, a cloud-based service for sharing applications. Docker Engine has been downloaded 100 million times, and 45,000 Docker applications are available on Docker Hub – a sign of the interest in the technology.
Docker applies the concept of the shipping industry’s containers to software. A Docker container is a standard way for an application to identify its infrastructure requirements. This approach allows developers to worry only about what is inside the container, while infrastructure operators worry about delivering additional IT resources that the application needs.
Docker containers are designed to be more portable and efficient than virtual machines. While virtual machines consist of an application, binaries, libraries and an operating system, a Docker Engine container includes just the application and its dependencies, running on the host operating system and sharing the kernel with other containers.
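In practice, that packaging is declared in a text file called a Dockerfile, which lists the application's dependencies and the command the container runs. The sketch below is an illustrative example, not drawn from any program mentioned in this article; the file names (app.py, requirements.txt) and image tags are assumptions:

```dockerfile
# Start from a base image that supplies only the language runtime,
# not a full guest operating system as a virtual machine would.
FROM python:2.7

# Copy in the dependency list and install it inside the image.
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt

# Copy in the application code itself.
COPY app.py /app/

# The single process the container runs; it shares the host kernel
# with every other container on the same machine.
CMD ["python", "/app/app.py"]
```

Building this file with `docker build` and launching it with `docker run` produces a container holding only the app and its dependencies; publishing the resulting image to Docker Hub is how the sharing half of the workflow happens.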
“Containerization is really operating system virtualization that is more efficient than the typical approach, and it’s more application focused,” said Bill Kleyman, national director of strategy and innovation at MTM Technologies, a Stamford, Conn.-based consulting firm. “It provides the necessary resources to run an application as if it is the only app living on the operating system.… App containers have root access, direct access to libraries. This is not something you can do with standard architectures.”
One advantage of containers is that they reduce the amount of IT infrastructure – including compute, network and storage – that customers need to purchase in order to run apps.
“Containers are really interesting. It’s almost the next version of virtualization that’s a little more efficient,” Adams said. “Say you want to stand up 10 virtual machines, and each of those virtual machines is 10 gigabytes in size. If you stand up a 10G container in Docker, it shares more resources under the hood. It wouldn’t consume even close to 100G of resources because of the way Docker does things.”
Containers offer “increased efficiency and increased control over the app you’re trying to deliver,” Kleyman said. “A containerized app allows for real-time, cloud-native performance. That’s the big thing. They offer a greater degree of isolation and looser coupling on layers of virtualization than traditional approaches. This isolation provides a greater degree of reliability and also greater amounts of control.”
Government software development shops like GSA’s 18F software innovation program are likely to be the first to deploy Docker and other container technologies.
“Any app moving towards web services or a Service Oriented Architecture is a good candidate for containerization,” said Mark Ryland, chief solutions architect at Amazon Web Services Worldwide Public Sector. “Docker is a standard that has taken off like wildfire. All the major cloud vendors – Google, Microsoft and us – support the Docker container format. The government is going to like it because it is a de facto standard.”
Docker rivals include CoreOS, Canonical, Spoonium and Flockport.
The main challenge facing containerization is that the technology is brand new and relatively untested. A good sign that Docker is ready for government apps will be when it is supported in enterprise-class IT management systems such as VMware’s vCenter, Adams said.
“Right now, you can’t spin up the containers or run virtual containers or create high-availability clusters – all the stuff that you want to do to create a virtual computing world,” Adams said. “There is no central management console to do it.”
In government circles today, Docker is used only for developing and testing applications, but Adams said she has seen government RFPs for development and test environments that request Docker support.
“This is a technology that people are going to want to follow,” Adams said.
Editor's note: This article was changed Jan. 15 to clarify Mark Ryland's title and to add Red Hat to the list of Docker supporters.
Carolyn Duffy Marsan is a writer based in Milwaukee, Wisc., covering enterprise technology.