API Enablement

With digitization, open banking application programming interface (API) regulations, and omni-channel customer experience becoming high-priority initiatives for most organizations, core or mainframe renewal and APIfication are necessary to ensure the success of these initiatives. Creating a successful API is not easy, and creating APIs for the mainframe is even more challenging, because it requires deep integration and data-exchange capabilities between legacy applications and newly written applications in the API digital world.

Existing mainframe systems have been built up over the years and encapsulate enormously valuable business logic and data. These rich and mission-critical sources of business services and information are critical for digital transformation. As enterprises address business transformation, they will have to make their mainframe data and applications available as APIs within total digital solutions to extend their digital reach for consumption through social, mobile, analytics, and cloud (SMAC) channels.

We leverage a number of in-house tools and frameworks to accelerate, in an incremental manner, the enablement of new user experiences and functionality through microservices and APIs for clients running their core systems on the mainframe.
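
A large part of the data-exchange work described above is mapping legacy record layouts to API-friendly payloads. The sketch below illustrates the idea, assuming a hypothetical fixed-width customer record modeled on a COBOL copybook; real field names, offsets, and conversions would come from the application's actual copybooks.

```python
import json

# Hypothetical fixed-width record layout (field name, offset, length),
# standing in for a COBOL copybook definition.
CUSTOMER_LAYOUT = [
    ("customer_id", 0, 8),
    ("name", 8, 20),
    ("balance_cents", 28, 10),
]

def record_to_json(record: str) -> str:
    """Map one fixed-width legacy record to a JSON payload for an API."""
    fields = {}
    for name, offset, length in CUSTOMER_LAYOUT:
        fields[name] = record[offset:offset + length].strip()
    # Convert the zero-padded amount field to a number for API consumers
    fields["balance_cents"] = int(fields["balance_cents"])
    return json.dumps(fields)

raw = "00001234JANE DOE            0000004250"
print(record_to_json(raw))
```

In practice this mapping layer sits behind an API gateway or integration product rather than standalone code, but the record-to-JSON translation step is the same.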

Additional Reading

Embracing Microservices Architecture in Telecommunications

Simplify integration through Hypermedia APIs in the Digital Economy

API Enablement of Mainframe Applications

How can organizations with large legacy footprint plan their API journey?

A rapid API roll out with a robust funding approach

Blockchain and IoT: The future of API economy

Legacy Decommissioning

As clients embark on simplifying and optimizing their existing mainframe footprint and migrating out of mainframe, having a comprehensive decommissioning strategy is essential to ensure that unused or migrated applications are actually retired and the data is archived for any future usage.

Our decommissioning solution is a step-by-step approach that provides a comprehensive framework to achieve speed-to-value and economies of scale, simplify the landscape, enable agility, and reduce run cost.

The use of tools and accelerators helps in the process of data analysis, migration, and archival during the decommissioning exercise. With Infosys, clients benefit from the following:

  • RunToRetire services for decommissioning portfolio applications and archiving data
  • Data retrieval as a service, with both hot and cold retrieval; most customers are reluctant to decommission because they lack a retrieval strategy
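
The retrieval concern above can be addressed by pairing the archive with an index so that cold data stays accessible after the application is retired. The sketch below is purely illustrative (file names, record shape, and index format are assumptions, not part of any specific product): records are written to a compressed archive, and a lightweight index supports on-demand retrieval.

```python
import gzip
import json
import os
import tempfile

# Illustrative archive location; a real solution would use governed storage
path = os.path.join(tempfile.gettempdir(), "decommissioned_app.jsonl.gz")

def archive_records(records, path):
    """Write records to a compressed archive; return a retrieval index."""
    index = {}
    with gzip.open(path, "wt", encoding="utf-8") as fh:
        for offset, record in enumerate(records):
            fh.write(json.dumps(record) + "\n")
            index[record["id"]] = offset  # line number inside the archive
    return index

def retrieve_record(path, index, record_id):
    """Cold retrieval: scan the archive up to the indexed line."""
    target = index[record_id]
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for offset, line in enumerate(fh):
            if offset == target:
                return json.loads(line)

records = [{"id": "A1", "amount": 100}, {"id": "B2", "amount": 250}]
idx = archive_records(records, path)
print(retrieve_record(path, idx, "B2"))
```

Hot retrieval would keep frequently requested records in a live store; the cold path shown here trades latency for much lower storage cost.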

Additional Reading

Mainframe Legacy Decommissioning

Legacy Decommissioning - Data Archival and Retrieval

Batch Offloading

Most organizations’ mainframe workloads consist of batch processes that could be classified into one or more of these categories:

  • End of day/month/year processes
  • Periodic batch/transactional processing
  • Report and statement generation
  • Data ingestion and extraction into mainframe database (DB2, IMS, VSAM)
  • Data transformation and transmission
  • Data archival and purge

Besides periodic processes, reporting and data-related jobs deliver medium to low value to the business and are expensive to run on the mainframe. We help clients identify these workloads and offload them to the most suitable platform, delivering better performance and significantly lower costs with a leaner, lighter mainframe.

  • Low-value, poorly performing jobs that process large amounts of data are best suited to be offloaded to the Hadoop platform, an alternative for mainframe workloads because of its scalability, cost-effectiveness, and faster processing time
  • Mission-critical transactional and periodic processes/jobs that involve bulk processing are suited to be moved to Spring Batch, a lightweight, comprehensive batch framework
  • Batch processes on mainframes typically involve some kind of Extract, Transform, and Load (ETL) processing, be it transactional, end-of-day/month processing, or report generation. Owing to the similarity in concepts, processes, and terminology, ETL platforms have evolved as a suitable alternative for running batch workloads migrated from the mainframe
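
The chunk-oriented read-process-write model underlying frameworks such as Spring Batch can be sketched in a few lines; the version below is written in Python purely for illustration (Spring Batch itself is a Java framework), with hypothetical reader, processor, and writer stand-ins.

```python
# Sketch of a chunk-oriented batch step: items are read one at a time,
# processed, and written in chunks, so a failure affects only the
# current chunk rather than the whole job.
def run_chunked_step(reader, processor, writer, chunk_size=3):
    chunk = []
    for item in reader:
        chunk.append(processor(item))
        if len(chunk) == chunk_size:
            writer(chunk)          # in a real job: one transaction per chunk
            chunk = []
    if chunk:                      # flush the final partial chunk
        writer(chunk)

written = []
run_chunked_step(
    reader=iter(range(1, 8)),      # stand-in for rows read from DB2/VSAM
    processor=lambda n: n * 10,    # stand-in for transform/enrich logic
    writer=written.append,         # stand-in for loading a target store
    chunk_size=3,
)
print(written)  # processed items, grouped into chunks
```

Chunk size is the key tuning knob: larger chunks amortize transaction overhead, smaller chunks limit how much work is redone on restart.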

Additional Reading

Improve operational efficiency through batch modernization

Migration of Mainframe Batch Workloads to ETL Platforms


Re-hosting

Mainframe re-hosting enables organizations to exit legacy infrastructure while retaining scalability, resilience, availability, and low cost. This approach suits enterprises that still see great value in their existing applications and, at the same time, find cost optimization the biggest driver to modernize.

Re-hosting refers to a 'lift and shift' strategy of moving legacy applications and data to an emulation platform powered by specialized re-hosting software such as Micro Focus Enterprise Server, TmaxSoft OpenFrame, Oracle Tuxedo ART, and Raincode, together with modern databases. Apart from reducing total cost of ownership (TCO), re-hosting enables organizations to extend their IT capabilities through integration with multiple enterprise products and solutions.

Additional Reading

Lift-and-shift to cloud at zero risk with Infosys

Microsoft Azure: The right platform for mainframe modernization

Accelerate your mainframe migration to AWS

Strategy is key to migrate from mainframe to cloud

Accelerate performance, Operational efficiency and Reduce run cost of core systems

TidalScale based Re-hosting

While re-hosting is a great option for simplifying the IT landscape, it is not suitable for organizations running large workloads, upwards of 5,000 million instructions per second (MIPS), on the mainframe. Re-hosting software can only scale vertically, and the products that can scale horizontally have not been proven capable of handling large workloads.

In such cases, we enable organizations to create and run large virtualized infrastructure using TidalScale, which can easily support and run extremely large workloads. TidalScale Software-Defined Servers are self-optimizing, use standard hardware, and are compatible with all applications and operating systems. They can deliver in-memory performance at any scale.

Additional Reading

Lift-and-shift to cloud at zero risk with Infosys

Infosys invests in TidalScale

TidalScale – An Introduction

TidalScale – A pioneer in software-defined servers

Re-Engineering Legacy Systems to Next Generation Architecture

When the existing functionality and processes implemented on the mainframe can no longer support newer business models and functionality, or are difficult and expensive to scale, re-engineering to a cloud-native architecture becomes a high-potential modernization option.

Re-engineering creates a new application with comparable performance while re-imagining the user experience and adding new business functionality. It is typically done using cloud-native techniques, leveraging microservices, containers, and decoupling, allowing customers to take advantage of the scale, innovation, and security of the cloud.

Re-engineering implementation and delivery is accelerated using Infosys as well as partner platforms and tools. Infosys helps clients with:

  • A platform-based re-engineering approach focusing on solutions for high-transaction, mission-critical applications
  • Industry solutions that drive re-engineering from a business perspective to create next-gen/future digital enterprises
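
Incremental re-engineering is often realized with a routing layer that sends already-migrated functions to new cloud-native services while the remainder still goes to the legacy core (the so-called strangler fig pattern, named here for illustration, not in the text above). A minimal sketch, with all paths and handlers hypothetical:

```python
# Routing layer for incremental migration: paths move from the legacy
# handler to new microservices one capability at a time.
def legacy_handler(request):
    return f"legacy:{request}"

def new_microservice_handler(request):
    return f"microservice:{request}"

# Grows as functionality is re-engineered; illustrative paths only
MIGRATED_ROUTES = {"/accounts", "/payments"}

def route(path, request):
    if path in MIGRATED_ROUTES:
        return new_microservice_handler(request)
    return legacy_handler(request)

print(route("/accounts", "get-balance"))   # served by the new service
print(route("/loans", "get-schedule"))     # still served by the legacy core
```

The value of this shape is that migration risk is confined to one route at a time, and any route can be switched back if the new service misbehaves.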

Additional Reading

Re-engineer legacy systems to take full advantage of next-gen technologies

Strategy is key to migrate from mainframe to cloud

Microsoft Azure: The right platform for mainframe modernization

Accelerate your mainframe migration to AWS

DevOps for Mainframe

DevOps has been evolving over the last few years; at its most basic, it is the practice of bringing development, testing, and operations together to share processes and procedures. On mainframes, it requires a huge mindset change, as traditional mainframe developers are used to quarterly or monthly releases rather than continuous delivery. Continuous delivery is an approach through which software moves from the workstation to the production environment so that value can be delivered to stakeholders with as little delay as possible.

To take this forward to our clients, we have developed a lighthouse program in collaboration with the consulting team. This program will define agile and continuous integration (CI)/continuous deployment (CD) processes specific to mainframes and provide an enablement plan to shift from the traditional waterfall approach to new ways of working.

We leverage our unique DevOps platform in conjunction with partner tools from Compuware, Micro Focus, and IBM to implement DevOps on the mainframe for our clients.

Additional Reading

Enabling DevOps on Mainframes for Legacy Modernization

Why DevOps

How DevOps can improve mainframe app agility

To get started, please send in your queries to modernization@infosys.com
