The Mainframe at 60: Future-Proof Legacy Computing for AI

Noting the 60th anniversary of the IBM System/360 mainframe, a chief technology officer looks at the current state of the technology and where it is heading in the next 60 years.

Vincent Alloo, Chief Technology Officer, Zetaly

May 3, 2024


Six decades ago -- on April 7, 1964 -- IBM announced its landmark System/360 family of mainframe computers. Described at the time as a “$5 billion gamble”, the bet-the-business strategy turned out to be a smash hit, likened to the Ford Model T and Boeing’s early 707 aircraft in terms of its impact on society.

Defying all expectations, mainframes are as useful to many organizations today as they were in the late ’60s. But this longevity comes at a price: opaque costs and complexity, and the mainframes and the data they hold often sit in isolation from increasingly cloud-based infrastructure.

Fortunately, enterprise mainframe customers are increasingly turning to a new breed of technologies in a bid to optimize performance, minimize costs and extract more business value. Their efforts could future-proof the legacy technology for another 60 years. 

Nowhere Near Retirement 

The System/360 in many ways blazed a computing trail -- not least by establishing, for the first time, a unified product line in which the same software could run on any machine in the family. It was later described as setting “the benchmark for mainframe performance for many years”. Yet mainframes are far from being yesterday’s technology. In fact, they are still a vital part of the IT infrastructure in most Fortune 500 organizations.


According to some reports, mainframes are used today by two-thirds of Fortune 500 companies. Mainframes are found in 45 of the top 50 global banks, eight of the world’s top 10 insurers, seven of the top 10 retailers, and eight of the top 10 telecom companies. Their reliability, security and scalability make them a popular choice for critical business operations such as real-time fraud detection, credit card processing and core banking.  

The Price of Legacy 

However, this power comes at a cost. One estimate claims enterprises are spending an average of $65 million each year on their mainframes, with a fifth of this cost going on maintenance. That becomes harder to justify in a world where cloud infrastructure costs are coming down while licensing and staff costs for mainframes continue to rise.  

One healthcare provider is reportedly spending as much as 75% of its annual budget on mainframe maintenance. In this context, even percentage-point improvements in cost optimization can be significant. But opaque pricing and convoluted fee calculations make this difficult to achieve.

Aside from the basic costs associated with keeping the lights on, there are challenges involving skills as mainframe experts retire with no obvious successors. There are complexities associated with how and where the platforms are used, and the programming languages they require. The latter can exacerbate connectivity and integration headaches, which reduce the business value the mainframe can add.    


The Next 60 Years 

Yet, all is not lost for big iron. In fact, there are three macro-trends that could usher in a renaissance for the mainframe, as long as enterprises are able to tackle the cost, performance and integration issues.  

First is the push for greater energy efficiency in the context of global commitments to sustainability and zero emissions targets. The technology sector’s share of global greenhouse gas emissions was estimated in 2020 at 1.8 to 2.8% -- as much as the aviation industry. It has been argued that mainframes can actually offer reduced power usage and emissions versus the cloud. But organizations will need to find more effective ways to optimize performance going forward if they want the mainframe to stay relevant in the long run. 

They also need to think more carefully about how mainframe technology can support artificial intelligence projects. It’s already fair to say AI is fundamentally transforming the way businesses operate, as organizations turn to predictive analytics engines capable of anticipating market trends and to generative AI (GenAI) that could supercharge productivity in customer service, software development, and other use cases. PwC estimates that AI could add $15.7 trillion to the global economy by 2030. IBM certainly sees a seat at the table for the mainframe, having engineered its next-gen Telum processor with AI in mind. As the central chip for the IBM Z and LinuxONE systems, the 7-nanometer, eight-core microprocessor will deliver enhanced performance, security and availability, and enable real-time AI embedded directly in transactional workloads.


The final trend worth watching is quantum computing: an advance in computing power that will accelerate humanity into a new era of innovation and progress. Promising machines orders of magnitude more powerful than even the fastest of today’s supercomputers, it is still some years away. But when it arrives, quantum infrastructure will be truly transformative for businesses. And there can be a place there too for the mainframe. 

So how can businesses leverage their historic investment in the mainframe to take advantage of these trends? One capability they need is powerful observability tools to monitor, analyze, and visualize system behavior and resource usage. Such tools would deliver real-time insights into performance, availability and health, allowing organizations to optimize resource allocation and proactively address issues. That will enable them to optimize energy efficiency in line with sustainability and budget targets.
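To make this concrete, below is a minimal sketch, in Python, of the kind of proactive capacity check such observability tooling automates. The endpoint, field names (msu_used, msu_cap) and alert threshold are illustrative assumptions, not any specific vendor’s API.

```python
import requests

METRICS_URL = "https://observability.example.com/api/mainframe/cpu"  # placeholder endpoint
MSU_ALERT_THRESHOLD = 0.85  # warn when usage reaches 85% of the contracted capacity


def check_capacity() -> None:
    """Poll a metrics endpoint and flag LPARs that are close to their MSU cap."""
    resp = requests.get(METRICS_URL, timeout=10)
    resp.raise_for_status()
    sample = resp.json()  # e.g. {"lpar": "PROD1", "msu_used": 420, "msu_cap": 500}

    utilization = sample["msu_used"] / sample["msu_cap"]
    if utilization >= MSU_ALERT_THRESHOLD:
        # A real tool would page an operator or trigger automated workload
        # rebalancing here rather than just printing a warning.
        print(f"{sample['lpar']}: {utilization:.0%} of MSU cap used, review workload placement")
    else:
        print(f"{sample['lpar']}: {utilization:.0%} of MSU cap used, within budget")


if __name__ == "__main__":
    check_capacity()
```

The same polling loop, pointed at cost and licensing metrics instead of CPU, is what turns raw usage data into the budget visibility discussed next.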

The same real-time insight could shine a light on murky vendor pricing and fee calculations, providing the visibility and meticulous oversight organizations need to transform cost management. That is a way not only to avoid unexpected and inflated invoices, but also to be proactive about budgetary controls and expenditure monitoring.

Finally, it’s about integrating the mainframe into a hybrid IT infrastructure -- so it can add value to AI and, one day, quantum projects. This boils down to data. For too long, mainframes have been somewhat isolated from other infrastructure systems because of limited support for modern formats like JSON. However, tools now exist that can collect mainframe data via simple APIs, analyze and enrich it, and then expose it seamlessly to third-party systems via secure APIs.
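As a rough illustration of that collect, enrich and expose pattern, the Python sketch below pulls usage records from a hypothetical REST gateway in front of the mainframe, adds a derived field, and forwards the result as JSON to a downstream analytics service. The URLs, field names and token handling are placeholders rather than a real Zetaly or IBM API.

```python
import os

import requests

# Placeholder endpoints for illustration only; not a real Zetaly or IBM API.
MAINFRAME_API_URL = os.environ.get("MAINFRAME_API_URL", "https://mainframe.example.com/api/v1/usage")
ANALYTICS_API_URL = os.environ.get("ANALYTICS_API_URL", "https://analytics.example.com/ingest")
API_TOKEN = os.environ.get("MAINFRAME_API_TOKEN", "")


def collect_usage_records() -> list:
    """Pull raw usage records, already serialized as JSON, from the gateway."""
    resp = requests.get(
        MAINFRAME_API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]  # assumed response shape


def enrich(record: dict) -> dict:
    """Add a derived field that downstream consumers care about (hypothetical)."""
    enriched = dict(record)
    enriched["over_budget"] = record.get("cpu_seconds", 0) > 3600
    return enriched


def forward_to_analytics(records: list) -> None:
    """Expose the enriched data to a third-party system over a secure API."""
    resp = requests.post(ANALYTICS_API_URL, json={"records": records}, timeout=30)
    resp.raise_for_status()


if __name__ == "__main__":
    raw = collect_usage_records()
    forward_to_analytics([enrich(r) for r in raw])
```

Once mainframe data flows through an interface like this, it can feed the same analytics, AI and reporting pipelines as any cloud-native source.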

In this way, organizations can finally leverage the mainframe in a wide variety of business use cases, from predictive AI analytics to high-powered fraud prevention. This is not just about making the mainframe future-proof, but about giving the technology a central role in shaping the enterprise of the future: a sustainable, high-performance future where better business decisions are made at the touch of a button. 

About the Author

Vincent Alloo

Chief Technology Officer, Zetaly

Vincent Alloo is the CTO of Zetaly, a software company that helps large enterprises better observe, manage and optimize the performance of their mainframes. With more than 15 years of deep technical experience developing, implementing, and supporting complex technology organizations for fast-growing companies, Vincent drives product-oriented strategy at Zetaly for growth, quality and profitability. Prior to Zetaly, Vincent held multiple C-level positions at a variety of companies, including Interim CTO/CPO at Valtus, Deputy CPO at Rayn, CTO at Alcuin, and CTO at CrossKnowledge.
