
AI Can Read COBOL - That’s Good News for the Mainframe

By Brian Klingbeil, EVP & Chief Strategy Officer at Ensono

The recent announcement around AI-assisted COBOL analysis has triggered a familiar cycle of headlines. The narrative suggests that a breakthrough in tooling signals the beginning of the end for the mainframe – and it is an easy story to tell. If AI can now read and interpret decades-old code at scale, the assumption follows that the barriers to migration have finally fallen away. But that assumption gets it backwards. For the many organisations whose workloads are ideally suited to the mainframe, migration was never on the cards. What AI actually does is make the mainframe proposition stronger—remediating the platform’s historical weaknesses while preserving its formidable strengths. Research shows that 93 per cent of executives continue to believe in the mainframe for the long term.

The ability to analyse legacy code more effectively is, without question, a step forward. But rather than paving the way for mass migration, it often reinforces the case for staying put. AI can now illuminate the complexity that made migration risky in the first place—and show why the mainframe remains the right home for mission-critical systems.  

There is a persistent logic in IT that continues to shape how this conversation unfolds. It starts with the idea that if code can be translated, applications can be moved, and if applications can be moved, platforms can be retired. It is a neat progression, but it overlooks the realities of how enterprise systems function in production.  

Mainframes are not simply collections of COBOL programs waiting to be rehosted, but rather environments built to handle specific types of workload with a level of consistency that is difficult to replicate elsewhere. High-volume transaction processing and I/O-intensive batch operations, for example, are core to how many organisations operate, and they are precisely where the mainframe continues to perform exceptionally well.

Over time, these environments accumulate more than just code: operational processes, monitoring patterns, recovery mechanisms and workarounds developed in response to real-world demands. Much of this is never formally documented or captured in formal process, yet it plays a critical role in ensuring that systems behave as expected under pressure.

Where migration becomes complex 

When projects run into difficulty, it is rarely because the source code could not be converted. The problems start to emerge when organisations attempt to recreate the broader system in a different environment. For example, data models may behave differently, and transaction flows may not align in the same way. The operational practices that once ensured stability no longer apply, and new ones have yet to mature.  

These are the factors that determine whether systems remain reliable and whether regulatory obligations continue to be met. Translating code simply relocates one component of a much larger, tightly coupled system, and AI does not remove this complexity. 

One of the most immediate impacts of AI, however, is visibility. Legacy estates that were once difficult to interpret can now be analysed in far greater detail. AI can map dependencies across COBOL and PL/I codebases, documenting logic, relationships and intent—so teams are no longer blocked when original authors have moved on. Work that once required months of manual effort can now be completed in a fraction of the time, with a level of accuracy that reduces uncertainty.  
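To make dependency mapping concrete, the sketch below shows the kind of static scan that such tooling automates and then extends far beyond. It is a minimal illustration only, not a description of any vendor's product: the cobol-src directory, the .cbl extension and the simple regular expressions are assumptions for the example, and it deliberately ignores dynamic calls, JCL, CICS and database definitions, which is precisely where AI-assisted analysis adds its value.

```python
import re
from collections import defaultdict
from pathlib import Path

# Static CALL targets and copybook includes in COBOL source.
# Dynamic calls such as CALL WS-PROGRAM-NAME are ignored here;
# resolving those is where AI-assisted analysis earns its keep.
CALL_PATTERN = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)
COPY_PATTERN = re.compile(r"\bCOPY\s+([A-Z0-9-]+)", re.IGNORECASE)


def map_dependencies(source_dir: str) -> dict:
    """Build a program -> {called programs, copybooks} map from COBOL sources."""
    deps = defaultdict(lambda: {"calls": set(), "copybooks": set()})
    for path in Path(source_dir).rglob("*.cbl"):  # assumed file extension
        text = path.read_text(errors="ignore")
        program = path.stem.upper()
        deps[program]["calls"].update(m.upper() for m in CALL_PATTERN.findall(text))
        deps[program]["copybooks"].update(m.upper() for m in COPY_PATTERN.findall(text))
    return dict(deps)


if __name__ == "__main__":
    # Print a simple dependency report for a hypothetical source folder.
    for program, links in sorted(map_dependencies("cobol-src").items()):
        print(program, "calls:", sorted(links["calls"]), "copies:", sorted(links["copybooks"]))
```

Even a crude map like this makes the scale of the task visible; what AI adds is the ability to do it accurately across millions of lines, and to document the intent behind what it finds.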

More broadly, AI is transforming how developers engage with the mainframe. Modern code assistants now “speak mainframe,” allowing engineers to work with familiar tools—VS Code, Git, and natural language prompts—while targeting z/OS. AI-assisted CI/CD pipelines, automated impact analysis, and purpose-built integrations are accelerating feature delivery in ways that were previously unthinkable. Developers no longer face a stark choice between modern workflows and mainframe deployment; they can have both.  

There is also an emerging story around running AI on the mainframe itself. IBM’s Telum chip and Spyre accelerators are enabling organisations to execute AI models directly where their data already resides—reducing latency, strengthening security, and unlocking new use cases without data movement. The mainframe is no longer just the subject of AI analysis; it is becoming a platform for AI execution.  

This greater visibility has practical benefits, allowing organisations to move beyond assumptions and base decisions on evidence, whilst shortening the time required to assess options.

In many cases, AI-led analysis highlights that core systems are already operating in the environment best suited to their requirements. Rather than prompting wholesale movement, it points towards targeted modernisation. Applications are refined, interfaces are improved, and development practices are updated, all while retaining the underlying platform that delivers the required performance and resilience. 

The mainframe remains highly effective for workloads that demand consistency and throughput. At the same time, cloud and distributed platforms provide advantages for areas that benefit from flexibility and scale. Hybrid architectures are an acknowledgement that no single platform is optimal for every type of work.  

The distinction between language and platform remains critical. Code can be translated, but platform characteristics are not so easily reproduced. Transaction processing behaviour, I/O optimisation, availability engineering and decades of operational refinement cannot be lifted and shifted in the same way as application logic. These are the elements that define how systems function in practice, and they are central to why migration remains a complex and often high-risk undertaking.   

Even the AI models themselves reflect this reality. When asked to assess what COBOL analysis enables, they tend to focus less on conversion and more on the surrounding system characteristics that must be preserved. The implication is straightforward. Understanding code is necessary, but it is not sufficient to guarantee a successful transition. 

The questions that technology leaders should be asking  

For technology leaders, the focus shifts from whether migration is technically possible to whether it is operationally and economically justified. AI provides better tools to answer that question, but it does not predetermine the outcome. Modernisation in place allows organisations to increase agility and reduce cost without exposing themselves to the risks associated with large-scale transformation.

AI-assisted COBOL analysis is a valuable development—but the real story is bigger. AI is not hastening the mainframe’s decline; it is eliminating the frictions that once made the platform feel dated. Legacy code is now discoverable. Development workflows are modern. And the platform itself is becoming AI-enabled. The factors that determine platform strategy remain the same, rooted in workload characteristics and operational requirements—but for organisations running the right workloads, the case for the mainframe has never been stronger. 

The question now is how this new capability is used to make better decisions about where systems should run and how they should evolve. For many enterprises, the answer is clear: the mainframe is here to stay. 
