
What's the deal with Mainframe Modernization?

Mainframe neutrality reframes modernization as harmonization, grounding AI-driven change in business reality instead of platform ideology.

March 22, 2026 · 10 min read

In 1991, Stewart Alsop made one of the most confident predictions in tech journalism: the last mainframe would be unplugged by 1996. The prediction did not age well, and in 2002 he ate his own words - literally.

More than three decades later, the mainframe still runs the core workloads of the largest enterprises on earth. The global financial system processes trillions of dollars daily on it. Airlines coordinate hundreds of thousands of flights through it. Insurers, governments, banks, stock exchanges, and multi-billion-dollar healthcare entities depend on systems that have run with little or no interruption for decades. These systems comprise billions of lines of COBOL, along with batch and online frameworks, other languages, and legacy third-party software packages - their sheer size and complexity growing over the years as integration requirements have compounded into monolithic application systems spanning on-premises, colocation, and cloud environments. Many of these enterprises now face critical skills gaps as expert workforces retire, taking decades of institutional knowledge with them.

The Lessons of History

The current modernization hype wave has a historical amnesia problem. CASE (Computer-Aided Software Engineering) tools promised automated code transformation in the 1980s, and while some succeeded, they created vendor lock-in, made emerging technologies difficult to adopt, and grew ever more expensive to license and maintain. Decades of COBOL-to-Java translation tools produced outputs that were technically runnable but operationally unusable - "JOBOL" - and many failed transition efforts. There were successes: the rise of Micro Focus COBOL on distributed platforms offered an alternative path for many organizations, and it now accounts for much of the COBOL running off the mainframe - by some estimates up to 40% of the global COBOL footprint.

The failure rate for modernization projects today still sits at 74%. Costs routinely run into the hundreds of millions. When these projects fail, they do not just delay a CIO's roadmap - they can impede the fundamental operations of the businesses they were supposed to improve. The pattern is familiar: incomplete initial analysis leads to scope creep, scope creep leads to change requests, and change requests lead to budget overruns that can force companies to increase their mainframe and third-party licensing spend mid-project - deepening the dependency they set out to escape. Add armies of consultants or offshore resources, and what began as a clearly laid plan becomes a haphazard, difficult-to-control ball rolling downhill. In this market, hubris is an existential mistake.

The reason these efforts fail is not that COBOL is spaghetti code. COBOL is a structured, efficient language that has evolved with new standards over the decades. The reason modernization fails is that decades of accumulated business logic, undocumented modifications, and institutional knowledge have been baked into these systems in ways no automated tool has yet fully extracted. Compounding this, most modernization projects are structured in waves spanning five to seven years - but the mainframe does not freeze while that work happens. It continues evolving with business and regulatory requirements, and most toolsets have no meaningful ability to synchronize those changing environments.

At the same time, the mainframe community has its own accountability here. For too long, the response to modernization pressure has been defensive, declaring the mainframe "legendary, not legacy". That has been understandable, and in some cases correct. But it has also meant that newer development practices, tooling, and talent pipelines have not been cultivated around the platform. In this defensiveness, the mainframe community has perhaps empowered its own critics.

Consider a simple date field on a green screen - a user types in a date, the application validates it. An entire industry emerged to paper over exactly this kind of interface with GUI wrappers, allowing modern-looking screens to sit on top of unchanged backends. "Lipstick on a pig," as it is sometimes called. The approach let enterprises appear to their users to be modernizing while the core system remained entirely untouched - creating new layers of complexity and shadow-IT dependencies that make any future attempt to change the actual backend significantly harder.
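The pattern is easy to sketch. Here is a hypothetical illustration in Python - the function names and formats are invented for this example - where the "modern" layer merely reformats input for an unchanged backend validator, which is exactly why the backend is no easier to change afterward:

```python
# Hypothetical sketch of a GUI wrapper over an unchanged green-screen backend.
# The names and formats below are illustrative, not taken from any real system.
from datetime import datetime

def legacy_validate_date(yymmdd: str) -> bool:
    """Stand-in for the untouched backend: accepts only six YYMMDD digits."""
    if len(yymmdd) != 6 or not yymmdd.isdigit():
        return False
    try:
        datetime.strptime(yymmdd, "%y%m%d")
        return True
    except ValueError:
        return False

def modern_date_widget(iso_date: str) -> str:
    """The wrapper: converts a modern ISO date into the legacy format and
    passes it straight through to the old validation. Nothing behind the
    screen has changed."""
    try:
        legacy_format = datetime.strptime(iso_date, "%Y-%m-%d").strftime("%y%m%d")
    except ValueError:
        return "REJECTED"
    return "ACCEPTED" if legacy_validate_date(legacy_format) else "REJECTED"

print(modern_date_widget("2026-03-22"))   # ACCEPTED
print(modern_date_widget("22/03/2026"))   # REJECTED
```

Every rule the business actually cares about still lives in the legacy routine; the wrapper only translates formats, so the backend's complexity is preserved, not reduced.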

In this world of diametrically opposed opinions and a platform that has become illegible to outsiders, Gestell advocates a clear-eyed mode of analysis: Mainframe Neutrality.

What is Mainframe Neutrality?

Neutrality is not indifference. It is a refusal to let platform loyalty or hype substitute for business judgment.

A neutral position recognizes that the mainframe deserves neither the legendary status its defenders assign it nor the pejorative one outsiders do. It demands that the platform stand on its own merits, evaluated alongside every other available architecture, for the specific workloads in question.

More practically: neutrality shifts the starting question. Instead of asking "what do we do about the mainframe?" a neutral organization asks "how do we best serve our customers and the business?" and lets the answer determine the architecture. The mainframe is infrastructure. Infrastructure serves the business. When it does that well, it should stay. When a different platform would serve the business better for a specific workload, the path there should be taken carefully and with full knowledge of what is at stake.

It is worth noting that modernization pressure rarely comes from a single direction. Cost reduction is part of it, but organizations are equally driven by changing business requirements, competitive pressure, and regulatory demands that legacy systems were never designed to accommodate. A neutral framework is the only honest way to navigate that complexity - one that evaluates each driver on its own terms rather than defaulting to either platform loyalty or modernization hype.

Neutrality allows for modernization to be recast as harmonization.

Harmonization, Not Modernization

We at Gestell define legacy harmonization as "the process by which the new is folded into the old, accelerating both."

We prefer harmonization to modernization because it more accurately captures how an enterprise can take a position of mainframe neutrality and focus concretely on its customers. It is not a line-by-line rewrite into Java. It is not a band-aid API layer or screen scrape that may break on the next update. Mainframe Harmonization is both a mindset and a process. As a mindset, it allows older infrastructure to be appreciated for what it actually does - rather than what it symbolizes - while remaining open to improvement and selective transition. As a methodology, it creates a foundation for decisions that are durable, rather than decisions that create new technical debt in the process of trying to eliminate the old.

Harmonization begins with understanding the system and the business it serves. From that foundation, many paths become possible: reducing unnecessary compute costs, integrating cleanly with modern platforms, or moving specific workloads to architectures better suited to them. The goal in each case is the same - increasing the business value of the overall IT stack, without mistaking motion for progress.

What AI Actually Makes Possible

Every previous wave of modernization tooling - CASE software, automated translators, API platforms - struggled in roughly the same way: it could process code syntax but could not understand intent or replicate performance. It could see what the code said. It could not understand what the code meant to the business that had been running it for decades. It could see systems, but not recreate the strength of mainframe native processes like CICS.

AI agents represent a genuine departure from this pattern, but only if they are built and deployed correctly. A well-trained AI agent operating on a mainframe environment can do things that were previously impossible at scale: ingest and structure millions of lines of legacy code, surface the business logic embedded within batch processes that run once a month, map dependencies across programs that have never been formally documented, identify dead code that has been accruing licensing costs for years, and flag the edge cases - the exception handlers, the annual processing jobs, the compliance implementations - that have historically been the hidden landmines in many migration projects.
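As a rough illustration of the dependency-mapping and dead-code steps - a sketch of the general technique, not Gestell's actual tooling - a first pass can be written as a static scan for COBOL CALL statements. Real environments also involve dynamic calls, JCL, and CICS transactions that no such scan can see, which is why the output below is labeled "candidate" rather than "conclusion":

```python
# Illustrative first-pass static analysis: map which programs CALL which,
# then flag programs unreachable from known entry points as dead-code
# candidates. Program names and snippets are invented for this example.
import re
from collections import defaultdict

CALL_PATTERN = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'")

def build_call_graph(sources: dict[str, str]) -> dict[str, set[str]]:
    """Map each program name to the set of programs it statically CALLs."""
    graph = defaultdict(set)
    for program, code in sources.items():
        for callee in CALL_PATTERN.findall(code.upper()):
            graph[program].add(callee)
    return dict(graph)

def candidate_dead_code(sources: dict[str, str], entry_points: set[str]) -> set[str]:
    """Programs never reachable from the entry points - candidates only;
    dynamic calls and scheduler references must be ruled out by hand."""
    graph = build_call_graph(sources)
    reachable, frontier = set(entry_points), list(entry_points)
    while frontier:
        current = frontier.pop()
        for callee in graph.get(current, ()):
            if callee not in reachable:
                reachable.add(callee)
                frontier.append(callee)
    return set(sources) - reachable

sources = {
    "PAYROLL": "PROCEDURE DIVISION. CALL 'TAXCALC' USING WS-REC.",
    "TAXCALC": "PROCEDURE DIVISION. DISPLAY 'TAX DONE'.",
    "OLDRPT":  "PROCEDURE DIVISION. DISPLAY 'NEVER CALLED'.",
}
print(candidate_dead_code(sources, {"PAYROLL"}))  # {'OLDRPT'}
```

The value of an AI agent lies precisely in going beyond what this sketch can do: resolving dynamic calls, reading JCL and CICS definitions, and attaching business meaning to each node in the graph.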

The two most meaningful areas of AI impact are maintenance and development.

On the mainframe for maintenance, AI agents can accelerate the process of bringing junior talent up to speed on undocumented systems - surfacing context and logic that would previously have required years of apprenticeship. Beyond talent, AI can identify optimization opportunities that do not require modernization at all: dead code accruing unnecessary licensing costs, batch jobs running inefficiently, workloads that could shift to lower-cost processors without touching the core system. Looking further ahead, MCP servers will emerge as a common protocol through which AI agents can interact with mainframe systems under human supervision - making the mainframe a first-class participant in modern AI-enabled infrastructure rather than an island unto itself.
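The supervision pattern matters more than any particular protocol. A deliberately simplified Python sketch - the tool names like submit_job and list_datasets are invented here, and a real deployment would sit behind an MCP server or similar protocol layer - shows the shape: read-only tools execute freely, while anything that mutates mainframe state is gated behind human approval.

```python
# Hypothetical human-in-the-loop gate for agent-initiated mainframe actions.
# All tool names and categories below are invented for illustration.
READ_ONLY_TOOLS = {"list_datasets", "read_job_output", "query_smf_metrics"}
MUTATING_TOOLS = {"submit_job", "update_member"}

def dispatch(tool: str, args: dict, approve) -> str:
    """Route an agent's tool request through the supervision policy.

    `approve` is a callback representing the human reviewer: it receives
    the tool name and arguments and returns True to allow the action.
    """
    if tool in READ_ONLY_TOOLS:
        return f"executed {tool}({args})"            # safe: no state change
    if tool in MUTATING_TOOLS:
        if approve(tool, args):                       # human in the loop
            return f"executed {tool}({args})"
        return f"blocked {tool}: approval denied"
    return f"blocked {tool}: unknown tool"           # default-deny posture

print(dispatch("list_datasets", {}, lambda t, a: False))
print(dispatch("submit_job", {"job": "PAYROLL"}, lambda t, a: False))
```

The default-deny branch is the important design choice: an unknown tool request is refused rather than guessed at, which is the posture a production environment of this criticality demands.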

On the development side, AI agents can assist in the task of decomposition and analysis of massive monolithic codebases - tackling the most important challenge in any migration: understanding what the system actually does. From that foundation of understanding, specific workloads can be evaluated honestly, moved where it makes sense, and left where it does not.

What this looks like in practice is more varied than most expect. Modern AI models handle COBOL surprisingly well, and Java can run on a zIIP processor within the mainframe architecture itself - meaning modernization does not always require leaving the platform. AI can also identify workloads suited to lower-cost zIIP and zAAP processors, and aging third-party packages like older CASE tools and report writers are increasingly viable targets for AI-assisted replacement. The options are broader than the conventional modernization conversation tends to acknowledge.
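The triage described above can be caricatured in a few lines. These rules are loose simplifications for illustration only - actual zIIP eligibility is determined by z/OS and the specific products involved, and real triage weighs far more signals than this:

```python
# Toy workload-triage sketch. The rules are invented rules of thumb, not an
# authoritative statement of zIIP eligibility or replacement suitability.
def triage_workload(workload: dict) -> str:
    if workload.get("language") == "Java":
        return "candidate: run on zIIP, stay on platform"
    if workload.get("type") == "distributed-sql":
        return "candidate: zIIP-eligible distributed DB2 work"
    if workload.get("package_era", 9999) < 1995 and not workload.get("vendor_supported", True):
        return "candidate: AI-assisted replacement of aging package"
    return "leave in place pending deeper analysis"

print(triage_workload({"language": "Java"}))
print(triage_workload({"package_era": 1988, "vendor_supported": False}))
```

The point of even a toy version is that the default branch is "leave in place pending deeper analysis" - neutrality encoded as the fallback, rather than modernization assumed as the goal.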

The critical caveat is architectural. Legacy modernization toolsets were built to solve a specific and narrow problem - transforming code syntax from one language to another. That is a fundamentally different problem from understanding what a system means to the business that has run it for decades. Retrofitting a GenAI layer on top of that tooling does not change the problem it was designed to solve. The underlying assumptions, workflows, and output structures were shaped by syntax transformation logic, and AI sitting on top inherits those constraints. True AI-native tooling starts from the harder and more important question - "what does this system do, and why?" - and builds from there.

The second caveat is the data. An AI agent trained on general programming patterns will go naively into a mainframe environment - encountering the varied syntax, the esoteric utilities, the decades of accumulated workarounds - and produce outputs that look plausible but are functionally wrong. The mainframe is intensely human. It was built by people who had specific ways of doing things, and it will only yield to AI systems that have been trained to recognize and respect that.

Mainframe neutrality requires that both futures remain possible - the mainframe staying where it serves the business, and selective modernization where it does not. Anyone claiming only one of those futures is the right one is, in all likelihood, about to eat their words.

The Work We're Doing

Gestell is ultimately a business committed to a neutral framework - hence the name. We are ardent appreciators of history and optimists for the future.

The mainframe systems running inside the largest enterprises on earth are not legacy problems waiting to be solved. They are the accumulated labor of decades of careful engineering by incredible talent, and they deserve to be understood for what they are before any attempt is made to change them. That understanding is rarely straightforward - these environments encompass not just COBOL but Assembler, boutique languages, custom-written frameworks, and decades of third-party tooling that was never designed to be understood from the outside. AI can now make that understanding possible at a scale and speed that was not previously achievable. That is what harmonization actually looks like in practice.

This is the work we're doing at Gestell. If it is a challenge you are navigating, we would be glad to talk.

If you want to learn more, you can reach us at hello@gestell.ai.