The container future is here. It’s just not evenly distributed
- By David Egts
- Aug 28, 2018
Science fiction writer William Gibson once said, “The future is already here -- it’s just not evenly distributed.” He was explaining that things we once thought of as futuristic already were a reality for some people, but not everyone.
He may as well have been talking about adoption of Linux containers within the federal government.
While evidence suggests that the public sector’s interest in Linux containers continues to grow, many agencies remain on the fence. Whether due to budget, lack of information or other constraints, government adoption of Linux containers has been slower than it has been in the commercial space. Many agencies continue to view containers as exclusively for the cool kids in Silicon Valley.
The reality is that Linux containers are not only a viable option for government agencies, they may very well be necessary for their digital transformation strategies. Containers can help agencies accelerate application development and support their migration to the cloud and automation. Additionally, agencies that have adopted DevOps and agile development processes can use containers to get applications into production even faster.
So what’s the holdup when it comes to containers? Let’s address some common concerns to help unsure CIOs better understand why and how they can make containers a part of their IT operations.
“I’m not ready because my legacy infrastructure does not support containers.”
Most agencies have accumulated far more technical debt than private-sector startups, which do not carry the burden of managing thousands of older, mission-critical Unix and mainframe applications. But just because agencies may still be harboring legacy applications does not mean they cannot benefit from Linux containers. They just need to take a more methodical approach.
Agencies did not immediately move their Unix-based enterprise resource planning systems to Linux when the operating system was first announced. The migration to Linux was gradual, with agencies at first moving the low-hanging fruit of web and DNS servers before tackling more sophisticated projects.
Agencies with legacy infrastructures and applications can take a similar approach to container adoption, containerizing applications one at a time. Web front ends, for instance, are a good starting point.
There is no need to burn down existing IT infrastructure to enjoy the benefits of containers. The new and the old can exist side by side and interact with one another. Nor do they need to sit in separate silos: containerized web front ends can present data served from legacy applications and data sources. Over time, the remaining applications and data sources can be containerized (or not), depending on whether it makes good business sense to do so.
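The side-by-side pattern described above is essentially a translation layer: the containerized front end speaks modern formats while the legacy system keeps serving data in its own. A minimal sketch, with a made-up fixed-width record format standing in for a mainframe service (the function names and layout are illustrative, not from the article):

```python
def legacy_inventory() -> str:
    """Stand-in for a legacy Unix or mainframe service that
    returns fixed-width records: 8 chars of item name, 4 of quantity."""
    return "WIDGET  0042GADGET  0007"

def parse_records(raw: str) -> list[dict]:
    """The containerized front end translates the legacy
    fixed-width format into JSON-friendly records."""
    records = []
    for i in range(0, len(raw), 12):          # each record is 12 characters wide
        chunk = raw[i:i + 12]
        records.append({"item": chunk[:8].strip(), "qty": int(chunk[8:])})
    return records

print(parse_records(legacy_inventory()))
# [{'item': 'WIDGET', 'qty': 42}, {'item': 'GADGET', 'qty': 7}]
```

The legacy service is untouched; only the adapter in front of it is new, which is what makes the one-application-at-a-time migration tractable.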
“I don’t know enough about containers to even understand where to begin.”
A few years ago, when containers were first starting to get noticed, there admittedly was far more talk about their potential than there was educational information about how to realize their benefits. Things have changed. Today there is a wealth of instructional material to help federal developers, CIOs and IT managers make sense of how to deploy containers within their agencies.
This information comes in many forms. There are whitepapers, workshops, events, seminars and even coloring books. Many of these resources are specifically dedicated to the government. Whatever the method, the information is out there and educational material on containers is quickly becoming ubiquitous.
“I’m concerned about security.”
As with anything else, organizations must take the appropriate precautions to drive reliability and enhance security around containerized workloads. Provenance and supply chain are key: A container procured from a questionable vendor or site could contain vulnerabilities, including malware, and developers may not be able to discern when -- or if -- the container has been patched. Agencies should refer to vendor container health indexes to validate the security of their containers.
Health indexes are often components of curated container repositories or registries that collect, assess and analyze a wide range of container metadata to help developers better understand the overall “health” of a container image. Containers are assigned letter grades (A, B, C and so on) that take into account known vulnerabilities, missing security data, the age of the container and more. The lower the grade, the greater the potential risk.
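The grading idea is simple enough to sketch. The following is an illustrative model only, not any vendor's actual scoring formula: unpatched vulnerabilities and image age reduce a score that maps to a letter grade.

```python
def health_grade(critical_vulns: int, important_vulns: int, age_days: int) -> str:
    """Assign an illustrative letter grade to a container image.

    A simplified model, not any vendor's real formula: critical
    vulnerabilities weigh heaviest, and stale images lose points
    for every quarter they go without a rebuild.
    """
    score = 100
    score -= 40 * critical_vulns
    score -= 10 * important_vulns
    score -= (age_days // 90) * 5
    if score >= 90:
        return "A"
    if score >= 75:
        return "B"
    if score >= 60:
        return "C"
    return "F"

# A freshly rebuilt image with no known flaws grades well...
print(health_grade(0, 0, 10))        # "A"
# ...while a year-old image with a critical CVE does not.
print(health_grade(1, 2, 400))       # "F"
```

Whatever the exact formula a vendor uses, the takeaway is the same: the grade degrades over time unless the image is rebuilt against patched components.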
Vendor-provided containers that have received top health index grades are a great starting point, but they do not alleviate the security responsibilities of the IT team. IT professionals should never just blindly download and use containers, even those that have been deemed safe. Would reasonable IT admins do that with a standard application? They can trust, but they must also verify and take steps to secure the last mile.
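Part of verifying that last mile is confirming that what was downloaded is what the vendor actually published. A minimal sketch of that check using a SHA-256 digest (the sample bytes and digest source are hypothetical; in practice the published digest would come from the vendor's signed metadata):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of an artifact's bytes."""
    return hashlib.sha256(data).hexdigest()

def verify_artifact(data: bytes, published_digest: str) -> bool:
    """Trust, but verify: compare the downloaded artifact
    against the digest the vendor published."""
    return sha256_digest(data) == published_digest

# In practice `data` would be the image archive read from disk and
# `published_digest` a value from the vendor's signed release metadata.
image_bytes = b"example container layer"
expected = sha256_digest(image_bytes)                 # stand-in for a published value
print(verify_artifact(image_bytes, expected))         # True
print(verify_artifact(b"tampered layer", expected))   # False
```

Container registries already address images by content digest, so this check is largely automated, but admins should still confirm they are pulling by digest from a trusted registry rather than by a mutable tag from an unknown one.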
Containers can help government agencies check off many boxes. Containers accelerate application development and enable organizations to better maximize the value of their DevOps and agile development initiatives. They are an ideal technology for government agencies looking to automate and modernize their infrastructures.
There are now very few reasons not to take advantage of these benefits. With a little bit of education, and an eye toward security, even agencies relying on an extensive network of legacy technologies can start down the path to a more evenly distributed future.
David Egts is chief technologist, North America Public Sector, Red Hat.