What crashing COBOL systems reveal about applications maintenance

Recent news stories have detailed several state unemployment insurance systems’ inability to handle the estimated 1,600% increase in traffic over the past few weeks. Many may wonder why these systems weren’t in a highly scalable cloud environment in the first place.

The reason is logical: While shifting certain applications and infrastructure to the cloud has become popular in recent years, the mainframe remains the platform of choice for mission-critical transactions and data, with its proven reliability and security.

Mainframes can typically handle far more data and transactions than cloud-based servers. A typical database on a commodity server might be able to support 300 transactions per second, or around 26 million per day -- an impressive number, but far short of the billions of transactions a mainframe can support. From a security perspective, as we’ve discussed previously, surveys have consistently shown that a large number of CIOs are surprised at how much work is required to bring commodity servers up to par with mainframes.
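The daily figure follows directly from the per-second rate; a quick back-of-the-envelope check (a purely illustrative sketch, using the article's 300 transactions-per-second example):

```python
# Rough arithmetic behind the "26 million per day" figure above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

commodity_tps = 300  # transactions/second for a typical commodity-server database
daily_transactions = commodity_tps * SECONDS_PER_DAY

print(f"{daily_transactions:,} transactions/day")  # 25,920,000 -- roughly 26 million
```

Even at a sustained 300 transactions per second around the clock, a commodity server lands at about 26 million transactions a day, orders of magnitude below the billions a mainframe can process.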

So amid reports that “legacy systems are responsible” for unemployment system crashes, we need to make something abundantly clear: The issue is not mainframe hardware. Nor is it the COBOL applications running on these mainframes.

Contrary to recent news stories, neither the mainframe nor the COBOL language is ancient or arcane. IBM continues to regularly update its Z mainframe hardware. In fact, one of the reasons the mainframe has thrived for well over a half century is because IBM is constantly reinventing it to support customers’ ever-evolving needs and business requirements.

COBOL is also continually updated -- most recently in September 2019 -- and it currently handles 95% of the world’s ATM swipes with no problem.

The real problem? These COBOL applications and the COBOL developer experience have been allowed to languish -- and what we’re seeing right now is the direct result of some states’ failure to properly update and maintain critical COBOL code. When this code was needed the most, this failure became evident -- at the worst possible time.

The hard lesson learned is that tech systems aren’t something that can be set up once and never looked after again. This is particularly true in the case of so-called legacy systems. It’s not that these systems aren’t up for the job -- quite the contrary -- it’s just that they can’t be expected to keep up with ballooning transaction volumes on the front-end, with absolutely no care and feeding on the back-end. COBOL developers cannot keep these systems up-to-date if they are not provided with a modern, familiar developer experience that enables them to be comfortable coding on the mainframe.

The private sector, unlike the public sector, is responding to this increasing demand. Our recent survey shows development teams expect to increase the frequency of new mainframe application feature deployments by an average of 41% over the next year, compared to 38% for non-mainframe applications. This is a direct result of a proliferation of modern web-based applications placing unprecedented demands on back-end mainframes.

Some government IT teams may be lacking resources for application maintenance upkeep, but as we’ve stated before, something larger and more pervasive is happening: Critical mainframes and COBOL are being de-prioritized in favor of shinier, newer technologies like the cloud, mobile apps and artificial intelligence.

We're not sure if the states will be successful in luring COBOL developers out of retirement. But even if they are, it's a Band-Aid, not a solution. States must prioritize maintaining and modernizing COBOL applications running on their mainframes -- and upgrading the developer experience. This means empowering modern developers, regardless of skill level, to work on COBOL with the same speed and agility as they do other programming languages.

Not only would this insulate states from the unfortunate incidents that have plagued laid-off workers over the past few weeks, but it would also support effective knowledge transfer while delivering an exceptional developer experience that helps government IT teams compete in the industrywide battle for developer talent.

In an era of citizen-focused services -- and during these uncertain times -- government IT teams have learned they aren’t immune to the sudden load spikes that have typically been characteristic of the private sector, like e-commerce sites on Black Friday or stock trading apps on days of high market volatility.

This means that like their counterparts in the private sector, government IT organizations are software companies too -- requiring fast, agile development uniformly across the entire application delivery chain. This must extend from citizen-facing front-ends (often hosted in the cloud) all the way to back-end mainframes running mission-critical COBOL transactions. This is the key to leveraging the best and most cost-efficient hardware investments for their respective purposes, while serving citizens optimally in their time of need.

About the Author

Claire Bailey is the director, Federal, State and Local Solutions, Compuware.

