Why the time is right for hyper-converged infrastructure

Hyper-converged infrastructure (HCI) combines standard servers with specialized software to create a pool of computing and storage resources that provide a scalable, cost-effective, adaptable and easily provisioned data center architecture.



The rise of hyper-converged systems results from the confluence of several large technology and business trends that span disciplines:

Hardware. Decades of Moore's law progress in processor performance and memory speed and capacity have rendered the average server overpowered for typical workloads, making the one-application, one-server design model obsolete. Furthermore, the inherent advantages of semiconductor memory and rapid improvements in nonvolatile flash density have allowed servers to far exceed the input/output performance of hard disks and approach price parity with high-performance disks for primary storage.

Software. Server virtualization allows today’s overpowered hardware to be carved into workload-appropriate logical units. That decoupling of physical and logical resource sizing enables workload consolidation onto fewer systems while providing flexibility to more precisely match application requirements with resource capacity.

More recently, storage virtualization software has worked in the opposite direction, enabling multiple, previously discrete spindles and disk arrays to be aggregated into arbitrarily large pools that can be tapped for block, file or object storage.
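The pooling model described above can be sketched in a few lines of Python. This is purely illustrative -- the class and method names are invented for the sketch and do not come from any real HCI or storage-virtualization product:

```python
# Illustrative sketch of storage virtualization: discrete disks are
# aggregated into one pool, from which logical volumes can be carved
# that exceed the size of any single physical disk.

class StoragePool:
    def __init__(self):
        self.capacity_gb = 0
        self.allocated_gb = 0

    def add_disk(self, capacity_gb):
        """Aggregate another previously discrete disk into the pool."""
        self.capacity_gb += capacity_gb

    def provision_volume(self, size_gb):
        """Carve a logical volume out of the pooled capacity."""
        if self.allocated_gb + size_gb > self.capacity_gb:
            raise ValueError("insufficient pooled capacity")
        self.allocated_gb += size_gb
        return {"size_gb": size_gb}

pool = StoragePool()
for disk_gb in (4000, 4000, 8000):   # three previously discrete spindles
    pool.add_disk(disk_gb)

# A 10 TB volume is larger than any one disk but fits the 16 TB pool.
vol = pool.provision_volume(10000)
print(pool.capacity_gb, pool.allocated_gb)  # 16000 10000
```

The same decoupling works in reverse for server virtualization: logical resource sizing is detached from physical hardware, so capacity can be matched to the workload rather than to the box.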

IT operations. Server virtualization and the demand for new applications -- the latter fueled by the digitization of business processes and the explosion in the number of mobile clients -- have led to a concomitant increase in system management complexity and overhead that's unsustainable without system consolidation, simplification and automation.

The Federal Data Center Consolidation Initiative is a direct response to that trend. According to a Government Accountability Office report, agencies had closed 3,125 data centers by the end of fiscal 2015 and plan to close another 2,078 by the end of fiscal 2019.

Business environment. Lingering recessionary effects have intensified business competition and heightened management emphasis on efficiency and sustainable growth, which together have created a sustained period of tight budgets for noncore overhead activities such as IT. According to the IT Dashboard, the compound annual growth rate for federal IT spending over the past six years was just 1.3 percent.

Meanwhile, federal IT departments spend three times more on operations than on service development and modernization. Shifting some of that 68-plus percent share of the budget from routine maintenance to IT innovation has been an explicit strategy of the Obama administration -- and a clear opportunity for technology-fueled automation and efficiency.

The IT Dashboard results also suggest that agencies are somewhat more efficient than most enterprises; IT analysts peg the average IT organization's spending on routine activities -- system management, patching and updating, monitoring and troubleshooting -- at 70 percent or more. According to a 2015 IDC report, government IT's operational spending stems from large, older data centers that haven't been consolidated. Those centers often support inefficient, legacy three-tier applications that are ill-suited to cloud deployment or redesign, leaving little staff time or operating budget to respond to new service requests or support business innovation.

Private-sector organizations are increasingly turning to cloud services to support new applications, a practice quite familiar to federal IT managers working under the cloud-first mandate. Still, the cloud isn't the cheapest option, particularly for static, predictable workloads; those that need a lot of storage; or those with stringent government rules and requirements for backup, archiving and data sharing -- hence, the need for a more efficient and scalable architecture for internal data centers.

Collectively, the budget, operational and IT governance dynamics play right into HCI’s wheelhouse, making it a great fit for many agency workloads.

About the Author

Kurt Marko is a technology consultant and writer based in Boise, Idaho.
