Agencies are mixing physical servers, virtual machines and containers to get the storage and compute resources they need to maximize performance.
The structure of data centers continues to shift as new technology provides IT leaders with enhanced options. One of the choices government decision-makers face today is how to prioritize the use of virtual machines and containers in their environment so they have the needed computing power without wasted capacity.
In recent years, application owners have come to hold a stronger voice in technology hardware and resource decisions, a significant change from just a few years ago when these choices fell to enterprise IT managers.
The move comes as the backbone technology providing computing power and storage also changes. Cloud computing, virtualization and containerization provide new compute options, leaving technology leaders awash in possibilities.
This is a far cry from the days of the mainframe, when computing power was tightly monitored and rationed. Today, agencies can access computing power as needed. With traditional hardware commoditized, the question now becomes how to navigate this new virtual-focused landscape and, particularly, how to leverage available computing options to maximize performance.
Calling on virtual machines
VMs allow application software and the operating system to be independent of the hardware.
Organizations can deploy an OS for a virtual machine on a generic server and use a hypervisor to manage deployments. Before the hypervisor, the OS had to be customized specifically for the hardware it ran on, but that is no longer required.
Before VMs, server utilization often ran well below maximum thresholds, since every server was sized for the spikes or heaviest loads expected. Now, VMs allow organizations to run closer to capacity and eliminate unused or redundant resources. VMs also offer a straightforward replacement for appliance-based archives: agencies can eliminate tape and consolidate resources while becoming hardware agnostic. What’s more, VMs provide proven security and broad API support for applications with multiprotocol requirements.
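The utilization gap that drives consolidation can be sketched with a toy calculation. The workload numbers below are hypothetical, and the sizing rule for the shared pool (largest single peak plus the average load of the remaining applications) is just one illustrative hedge against simultaneous spikes:

```python
# Hypothetical per-application load profiles, in CPU cores.
apps = {
    "payroll":   {"avg": 4, "peak": 16},
    "web":       {"avg": 6, "peak": 24},
    "reporting": {"avg": 2, "peak": 8},
}

# Dedicated hardware: one server per app, each sized for that app's peak.
dedicated_capacity = sum(a["peak"] for a in apps.values())   # 48 cores
avg_load = sum(a["avg"] for a in apps.values())              # 12 cores
dedicated_util = avg_load / dedicated_capacity               # 25%

# Virtualized: VMs share a pool sized for the largest single peak plus the
# average load of the other apps, betting they do not all spike at once.
biggest = max(apps.values(), key=lambda a: a["peak"])
shared_capacity = biggest["peak"] + (avg_load - biggest["avg"])  # 30 cores
shared_util = avg_load / shared_capacity                         # 40%

print(f"Dedicated utilization:   {dedicated_util:.0%}")  # prints 25%
print(f"Virtualized utilization: {shared_util:.0%}")     # prints 40%
```

The exact numbers matter less than the shape of the argument: sizing every box for its own peak strands capacity, while pooling lets average load sit much closer to what was purchased.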
The growing use of containers
VMs act in the way most people envision the cloud. They can scale up and down dynamically, providing a level of burst scalability as required, although only up to a point. That is where containers can play a vital role, especially as organizations look to develop new applications with potentially rapid compute needs.
Containers enable the orchestration of complex applications in any hardware environment. Containerizing object storage allows agencies to effectively manage it on their hardware platform of choice without being limited by the more monolithic deployment schema of traditional object storage.
Containers are standalone images with everything included – code, runtime, library dependencies, system tools and required settings – to execute whenever a system needs them. Containers run on a container engine, with an orchestration tool to manage multiple containers and engines at once. The engine is logically similar to the VM hypervisor but without the need for a guest OS.
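The core job of an orchestration tool can be illustrated with a toy reconciliation loop: compare how many copies of each containerized application should be running against how many are, then start or stop instances to close the gap. The application names and replica counts below are made up, and real orchestrators do far more, but the basic logic looks like this:

```python
# Toy sketch of an orchestrator's reconciliation step (illustrative only).
# desired: how many replicas of each containerized app we want running.
# running: how many the engine reports as currently running.
desired = {"records-api": 3, "batch-worker": 1}
running = {"records-api": 1, "batch-worker": 2}

def reconcile(desired, running):
    """Return the start/stop actions needed to match desired state."""
    actions = []
    for app, want in desired.items():
        have = running.get(app, 0)
        if have < want:
            actions.append(("start", app, want - have))
        elif have > want:
            actions.append(("stop", app, have - want))
    return actions

print(reconcile(desired, running))
# → [('start', 'records-api', 2), ('stop', 'batch-worker', 1)]
```

Because each container image carries its own dependencies, the orchestrator can place those started replicas on whatever hosts have spare capacity.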
That is important because new deployments can be developed and tested in a containerized environment without disturbing other applications or risking available capacity.
A shift in mindset
VMs can help organizations manage their computing power to create a dynamic system that operates more as the cloud was intended. While cloud storage can run near full capacity, many technology managers have carried their mainframe mindset (where every application must have ample space for a potential burst) into this new technology.
Containers act as a development platform that turns the traditional development structure inside out. In the past, organizations developed applications on their own stack and rolled them out beyond their environment only once they were ready. Containers now allow organizations to rapidly increase application capacity as needed, enabling them to build applications in a safe environment before bringing them inside.
This is paramount when building platforms that leverage big data and analytics, such as facial recognition, financial transactions, internet-of-things information for secure areas or battlefield resource tracking.
Which is right for you?
We often see a mix of physical, VM and containerization within the data center, in both private- and public-sector organizations. The hybrid model is the most prevalent option for the mix of datasets and applications managed in a modern enterprise.
The good news is that the choice between VMs and containers is not either/or. Instead, government agencies must anticipate the balance of their data center. Organizations that require storage and sustained compute power will want to leverage VMs, while agencies that prioritize development and innovation will lean more toward containers.
VMs and containerization are just two of the tools that have changed hardware value. Government agencies will benefit greatly from adjusting their technology strategy to meet this new paradigm. Technology leaders must understand the ecosystem they want to create and then build a storage and computing strategy that aligns with those goals.