
A look back to the past

Who remembers the old IBM 3270 monitors from the 1970s? Standard models displayed 24 lines of 80 characters in a green font on a black background, the 80-column line width being a legacy of the punch card. The monitors were also called terminals. There were no mice, cursors or graphics. The use of terminals ushered in the era of transaction processing in specialist units and company-wide electronic data processing (EDP) at insurance companies, banks, public authorities, car manufacturers, airlines and other large companies. Dialogue and batch applications were initially developed in Assembler, a programming language that is very close to machine language. Higher-level programming languages such as Cobol and PL/I came along later, primarily for business applications. IBM mainframes were a major breakthrough worldwide and are still in use in large numbers today.

Where mainframes are in use

Mainframes are in use at 71 per cent of Fortune 500 companies. Globally, mainframes are in use at:

  • 92 out of 100 banks
  • Six out of ten commercial enterprises
  • Ten out of ten insurance companies
  • 23 out of 25 airlines
  • Nine out of ten global health insurance providers

There are a total of 10,000 mainframe installations worldwide. While the majority are to be found in the US, there are roughly 150 installations in Germany, Switzerland and Austria. However, the number of mainframes is declining by about three to five per cent a year, which means that development costs are spread across fewer and fewer units. There are many reasons for this; two of them are the consolidation and outsourcing of data centres as part of corporate mergers and companies opting for standard software.

That being said, all companies face a similar situation: the baby boomers, who played a pivotal role in shaping the technology, are set to retire in the coming years. From a demographic perspective, this means that 75 per cent of employees with mainframe skills will exit the labour market in the next few years, while only about half of these can likely be replaced (Forrester Research, 2021). For younger people, Cobol is old school. Java is where it’s at.

This has major implications, seeing as many of today’s core applications are still based on code from the 1970s. There is good reason for this: IBM has ensured upward compatibility with each new generation of computer system and software. This is why we still find a lot of Assembler programs and large Cobol applications with more than 100,000 lines of code in use. These have hardened over the years and, without question, do their job very efficiently and reliably. What we are describing here are legacy systems.

Definition of a legacy system

In computing, a legacy system is an old method, technology, computer system or application program, ‘of, relating to or being a previous or outdated computer system’, yet still in use. Often referencing a system as ‘legacy’ means that it paved the way for the standards that would follow it. This can also imply that the system is out of date or in need of replacement.
Wikipedia

Trend towards mainframe players for hardware and software

In recent decades, we have witnessed a wave of consolidation amongst mainframe players. Whereas companies like Comparex and Amdahl once offered plug-compatible mainframes (PCMs), IBM’s zSeries is now the only mainframe range left on the market. Alongside IBM with its DS8000 series, only HDS and DELL/EMC offer disk storage solutions for mainframes. The situation is similar for independent software vendors (ISVs): Broadcom acquired CA, BMC took over Compuware, and the list goes on.

M&A in the tech sector is a lucrative field of business for private equity firms such as Thoma Bravo and KKR. This has led to consolidation and a reduction in the number of providers, meaning less competition and, in some cases, monopolies and the all-too-familiar vendor lock-in.

Times are changing – a wake-up call

The market is aware of the situation, as surveys by numerous analysts have shown that a large number of mainframe users have already begun taking modernisation measures. Studies demonstrate that up to 90 per cent of companies have already looked into the issue.

At this point I would like to draw a distinction between two types of mainframe modernisation:

  • Modernisation of the mainframe development environment
    • Goal: A unified, agile software development life cycle (SDLC)
  • Modernisation of mainframe applications
    • Goals: Gain autonomy by replacing legacy programming languages with Java and reduce costs

Modernisation of the mainframe development environment

For decades, mainframe environments were developed according to the waterfall model, which led to long development cycles in individually built environments. Conversely, specialist departments increasingly sought agile development methods with short product cycles like those available in the open-source space. Shadow IT emerged as departments set up stand-alone solutions on their own.

In recent years, modernisation measures have gained a foothold in mainframe development. The 3270-based TSO/ISPF interface is increasingly supplemented by graphical user interfaces (GUIs) such as the Eclipse-based IDz and Topaz workbenches. Traditional mainframe-based source code management (SCM) systems are being replaced by Git or other shared repositories. IBM’s zUnit and BMC Compuware’s Topaz for Total Test are gradually being rolled out to automate unit and functional tests by means of mocking functions. Deployment processes are automated using Jenkins pipelines. Code coverage and quality gates are used for quality control purposes. Continuous integration and delivery (CI/CD), DevOps and agile software development have therefore found their way into the mainframe world.
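
To illustrate the kind of test automation these tools bring to the mainframe, here is a deliberately simple Java sketch of the underlying pattern: a piece of business logic is isolated behind a mocked data access interface so that the test runs without any database or VSAM access. The class and method names are purely hypothetical; zUnit and Topaz for Total Test apply the same idea to Cobol and PL/I programs rather than Java classes.

  import static org.junit.jupiter.api.Assertions.assertEquals;
  import static org.mockito.Mockito.mock;
  import static org.mockito.Mockito.when;

  import java.math.BigDecimal;
  import java.math.RoundingMode;
  import org.junit.jupiter.api.Test;

  // Hypothetical data access interface that would normally read a DB2 table or VSAM file.
  interface ContractRepository {
      BigDecimal findMonthlyPremium(String contractId);
  }

  // Hypothetical business logic under test.
  class PremiumCalculator {
      private final ContractRepository repository;

      PremiumCalculator(ContractRepository repository) {
          this.repository = repository;
      }

      BigDecimal annualPremium(String contractId) {
          // Twelve monthly premiums, rounded to two decimal places.
          return repository.findMonthlyPremium(contractId)
                  .multiply(BigDecimal.valueOf(12))
                  .setScale(2, RoundingMode.HALF_UP);
      }
  }

  class PremiumCalculatorTest {

      @Test
      void annualPremiumIsTwelveTimesMonthlyPremium() {
          // The repository is mocked, so no real data source is needed for the test.
          ContractRepository repository = mock(ContractRepository.class);
          when(repository.findMonthlyPremium("4711")).thenReturn(new BigDecimal("99.90"));

          PremiumCalculator calculator = new PremiumCalculator(repository);

          assertEquals(new BigDecimal("1198.80"), calculator.annualPremium("4711"));
      }
  }

Run as part of a pipeline, tests like this feed the code coverage figures and quality gates mentioned above.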

However, establishing and implementing these DevOps tool chains requires a lot of time and effort as well as a good collaborative relationship with the software vendors. Many small to medium-sized mainframe installations (fewer than 5,000 MIPS and fewer than 50 developers) are reluctant to take on the task because the investment rarely pays off. It is also difficult to generate enthusiasm for DevOps among older developers, who prefer the familiar 3270 environment (REXX and CLIST functions) that has been optimised to their needs.

Modernisation of mainframe applications

More options are available here, as there are a number of avenues open to companies to modernise their mainframe applications. However, it is important to know what the global corporate IT strategy is because the modernisation measures need to align with the architecture. We encounter different approaches to mainframe application modernisation.

Code transformation involves the automatic transfer of existing code to Java, with no changes made to the business logic. adesso transformer is based on a tool suite consisting of the components at|analyze, at|convert, at|edit, at|batch and at|test. Refactorable code can also be incorporated in the transformer by adapting it to the customer’s specific needs. The advantage of code transformation is that the migration can be planned: testing (use cases) can be reduced to a minimum, and fixed-price migrations based on a well-founded analysis are the norm.
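
What ‘transferring existing code to Java with no changes to the business logic’ can look like is sketched below in strongly simplified form. This is a purely hypothetical illustration, not actual adesso transformer output: a Cobol COMPUTE statement (shown as a comment) is rendered as BigDecimal arithmetic, with the data names and the calculation carried over unchanged.

  import java.math.BigDecimal;
  import java.math.RoundingMode;

  // Hypothetical example of transformed code. The original Cobol statement might be:
  //
  //     COMPUTE WS-GROSS-PREMIUM ROUNDED
  //         = WS-NET-PREMIUM * (1 + WS-TAX-RATE)
  //
  // A rule-based transformation keeps the field names and the calculation and maps
  // the decimal arithmetic onto BigDecimal.
  public class PremiumRules {

      public static BigDecimal computeGrossPremium(BigDecimal wsNetPremium, BigDecimal wsTaxRate) {
          return wsNetPremium
                  .multiply(BigDecimal.ONE.add(wsTaxRate))
                  .setScale(2, RoundingMode.HALF_UP); // ROUNDED clause becomes commercial rounding
      }

      public static void main(String[] args) {
          // A net premium of 100.00 with a tax rate of 19 % yields a gross premium of 119.00.
          System.out.println(computeGrossPremium(new BigDecimal("100.00"), new BigDecimal("0.19")));
      }
  }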

Replatforming exists in a wide variety of forms and is based on the lift-and-shift philosophy; rehosting is often used as a synonym. A distinction is made between solutions with and without recompilation. In compiler-based solutions, the application is recompiled during replatforming and migrated to a Java virtual machine (JVM). In solutions without recompilation, applications, runtime environments and data are migrated automatically, and the EBCDIC mainframe format is retained. The applications are migrated in binary-compatible form to a container on Linux/x86 architectures, on premises or in the cloud. This is without doubt an interesting option if you wish to phase out zSeries systems quickly, for example on cost grounds. Application modernisation then takes place in a subsequent step, which is why this approach is often described as a bridge technology.
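
Retaining the EBCDIC mainframe format has practical consequences for any code outside the emulation layer that touches the migrated data. As a purely illustrative sketch (the 80-byte record layout and field positions are made up, and the JDK’s extended charsets are assumed to be available), this is roughly what reading an EBCDIC fixed-length file looks like in Java:

  import java.nio.charset.Charset;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.util.Arrays;

  public class EbcdicRecordReader {

      // EBCDIC code page commonly used on z/OS for Latin-1 text; part of the
      // JDK's extended charsets (jdk.charsets module) on most distributions.
      private static final Charset EBCDIC = Charset.forName("IBM1047");

      // Hypothetical 80-byte fixed-length record: contract id in bytes 0-9,
      // customer name in bytes 10-39. Real copybooks also contain packed-decimal
      // (COMP-3) fields, which need dedicated decoding and are omitted here.
      private static final int RECORD_LENGTH = 80;

      public static void main(String[] args) throws Exception {
          byte[] data = Files.readAllBytes(Path.of(args[0]));

          for (int offset = 0; offset + RECORD_LENGTH <= data.length; offset += RECORD_LENGTH) {
              String contractId = decode(data, offset, 10);
              String customerName = decode(data, offset + 10, 30);
              System.out.println(contractId + " | " + customerName);
          }
      }

      private static String decode(byte[] data, int offset, int length) {
          return new String(Arrays.copyOfRange(data, offset, offset + length), EBCDIC).trim();
      }
  }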

In parallel, there are solutions that primarily move application development to client and/or server technology, while production and operation remain on the host for the time being.

Traditional outsourcing could also be described to a certain extent as replatforming/rehosting, though the focus is not on application modernisation.

Replace is a strategy that involves replacing existing mainframe applications with standard software. A host of solutions are available on the market for different sectors, including from adesso. In many cases, however, mainframe applications cannot be replaced with standard software. If the primary goal is to take a mainframe out of operation, additional solutions such as code transformation are used for the remaining programs.

Reengineering, or recoding, is an approach that entails reprogramming applications and reorganising business processes. The goal is not to improve the quality of software maintenance but to reengineer existing systems on the basis of modern frameworks and programming languages and, in doing so, replace the legacy systems. The bimodal IT approach propagated by Gartner is set aside here, since the main goal is integration into existing modern architectures.

There is no ‘simple’ migration or replacement of a system: every system has grown into a work of art and must therefore be treated with the appropriate craftsmanship. This is why we at adesso transformer always speak of ‘hybrid modernisation’.

Conclusion

Mainframe technology continues to be a modern, established platform, proven once again with the new IBM z16. Long written off, the venerable mainframe will live on for many years to come. However, every company must decide for itself how best to meet the needs of its specialist departments and which technology is the right one to meet their needs. Demographic shifts and the replacement of mainframe skills will play a central role. There is no one-size-fits-all approach to legacy application modernisation. It always involves a combination of strategies.

However, mainframe modernisation can quickly become a ticking time bomb. Experience from recent projects shows that, depending on the initial situation, a migration project can take more than five years and cost tens of millions to complete. By that time, most baby boomers will have retired and the knowledge they held will have been lost.

My explicit recommendation is to analyse the current situation at your company with the assistance of neutral experts and suitable tools (to what extent do ‘legacy issues’ exist?) and incorporate the results into your IT strategy following an assessment by in-house architects.

We offer a full range of consulting services and solutions that spans the analysis and design stages, right through to implementation. Visit our website for more information on the services adesso transformer offers.

Would you like to learn more about exciting topics from the world of adesso? Then check out our latest blog posts.


Author Ralf Achcenich

Ralf Achcenich is a Senior Business Developer at adesso transformer Deutschland GmbH. He has 40 years of professional experience in the mainframe sector: initially on the customer side, then with various system integrators and IBM business partners as well as with international hardware and software manufacturers. The main focus of his most recent employers has been on mainframe modernisation.
