Transitioning an entire application delivery process to a new set of tools and practices can be a real challenge when the attempt is to reproduce past designs and keep things as close as possible to the way they were. A "no-baggage" strategy means moving to a new destination with minimal belongings, focusing on transforming the core activities and tasks of the delivery system. The approach discussed in this blog post is about avoiding the reproduction of complex legacy logic with new definitions and tools, which can potentially speed up your transformation to new development models for mainframe applications. The proposal is to not carry over habits and logic inherited from past years, but to embrace this "no-baggage" mindset and leap forward into new DevOps processes!
Development of mainframe applications over the last decades
The development of mainframe source code is a major topic for companies that have embraced this platform. As applications grew over time to tens of thousands or even millions of lines of code, developers needed a way to manage source components while following a reliable process. Back in the day, the Waterfall process was quite popular for long-term, complex projects, with the benefit of a list of well-defined tasks that would run sequentially. This development practice worked well when it was introduced, due to the relatively small size of IT systems and a certain degree of simplicity and isolation. Several prominent tools from the legacy library manager family aimed to simplify developers' lives by providing a framework and facilities to help application artifacts progress through the different phases.
In today's world, systems and applications are interconnected, and changes in business priorities require agility and flexibility to implement new features. Companies can't afford to develop new features through lengthy, siloed projects, where delivery outcomes are uncertain and can be jeopardized by higher priorities or constraints. Enhancements to existing applications and releases of new features are now meant to take weeks to implement, not months or years.
Introduction of new tools and practices
To speed up the software delivery lifecycle, more and more projects are adopting a DevOps approach and implementing Agile principles. This adoption goes along with the use of development tools that are specifically designed for flexibility and agility. The first and foremost tool of this family is Git, now used by more than 90% of developers across both distributed and mainframe platforms. Building a DevOps approach on Git has several benefits: it manages the source code (hashing, versioning, auditing), it supports advanced collaboration mechanisms (forks, branches, pull/merge requests, conflict resolution), and it is now often coupled with release management tools for planning purposes. Along with Git, developers have implemented automation to ease their lives when building applications. This automation is defined in pipelines, which comprise steps for building and packaging (known as Continuous Integration pipelines) and for deploying applications (known as Continuous Deployment pipelines). Pipelines are extensible on demand, as they can be customized with capabilities like source code scanning, automated testing, and review/approval processes.
Mainframe applications can benefit from this new toolset and these DevOps practices, and are not locked into legacy practices and tools. While gaining agility and shortening their delivery cycles, application teams can collaborate more easily with distributed teams, as they speak the same language and use the same tools. Nevertheless, parts of the CI/CD pipeline for mainframe applications require specialized tools to address platform specifics: for instance, the build of programs can be driven through IBM Dependency Based Build (DBB), and the deployment of mainframe artifacts can be performed by IBM Wazi Deploy. However, migrating mainframe applications to modern pipelines can be a challenge.
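To give a sense of what such a pipeline looks like, here is a minimal sketch written as a Jenkins declarative pipeline; any CI/CD orchestrator would work similarly. The agent label, paths, application name, and script arguments are illustrative assumptions, not a definitive setup:

```groovy
// Jenkinsfile - hypothetical CI/CD pipeline sketch for a mainframe application.
// Agent labels, paths, and arguments are assumptions for illustration only.
pipeline {
    agent { label 'zos-agent' }  // a Jenkins agent running in z/OS UNIX System Services
    stages {
        stage('Build') {
            steps {
                // invoke the zAppBuild framework, which drives IBM DBB;
                // --impactBuild restricts the build to changed files and their dependents
                sh '''$DBB_HOME/bin/groovyz zAppBuild/build.groovy \
                    --workspace $WORKSPACE --application MyApp \
                    --outDir $WORKSPACE/out --hlq USER.BUILD --impactBuild'''
            }
        }
        stage('Package') {
            steps {
                // keep the build outputs and DBB reports as pipeline artifacts
                archiveArtifacts artifacts: 'out/**'
            }
        }
        stage('Deploy') {
            steps {
                // hand over to IBM Wazi Deploy; the actual arguments depend on
                // your deployment method and environment definitions
                sh 'echo "invoke wazideploy-generate / wazideploy-deploy here"'
            }
        }
    }
}
```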
Adopting new workflows for managing applications' lifecycle
The very first step in adopting modern tools such as Git is moving source code from PDSs to Git repositories. This step doesn't represent an impassable hurdle, as companies typically reuse the segmentation defined in the solutions they already have. Still, mapping solutions such as IBM Application Discovery and Delivery Intelligence (ADDI) can help validate the componentization, ensure its accuracy, and determine whether shared components should be assigned to their owning applications and teams or remain shared in a dedicated Git repository.
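As a simple illustration, the following Groovy sketch copies PDS members into a local Git working tree using the DBB toolkit's CopyToHFS command. The dataset name, member list, and target path are assumptions; real migrations would typically rely on the sample migration scripts provided with DBB, which also handle code page conversion and file tagging:

```groovy
// Sketch: copy PDS members to a Git working tree before committing them.
// Dataset name, member list, and target path are illustrative assumptions.
import com.ibm.dbb.build.CopyToHFS

def sourcePds = 'USER.APP.COBOL'                  // assumed source library
def targetDir = new File('/u/user/my-app/cobol')  // local clone of the Git repository
targetDir.mkdirs()

['PROG1', 'PROG2'].each { member ->
    // one member becomes one file, following a lowercase naming convention
    new CopyToHFS()
        .dataset(sourcePds)
        .member(member)
        .file(new File(targetDir, "${member.toLowerCase()}.cbl"))
        .execute()
}
// then, from /u/user/my-app: git add, git commit, git push
```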
What can cause difficulties are the customizations created in legacy library manager processes to automate some aspects of the build phases. With the facilities provided by library managers, complex logic is often implemented in the processes or JCLs that compile mainframe programs, through extensive use of variable symbols (sometimes with substrings) and IF/THEN/ELSE conditions. Reproducing the same logic embedded in JCLs with new tools is tempting, to minimize disruption to development teams and smooth the learning curve, but it requires a lot of work for very little benefit. Understanding the complexity of such processes is time-consuming and error-prone, and it really slows down the transformation effort. Moreover, most of the steps are meant to be cleaned up anyway, as they use library manager-provided utilities that cannot be migrated or reused. The remaining logic, once the unnecessary steps are discarded, can be even harder to comprehend; it is likely not reproducible with modern tools and CI/CD pipeline orchestrators, or it implies diverting tools from their intended way of working and creating ad hoc customizations that are hard to maintain over time.
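In many cases, the per-program variation that a legacy JCL procedure expresses with variable symbols and IF/THEN/ELSE blocks can instead be externalized into file-scoped build properties. The sketch below follows zAppBuild's property conventions; the property values, file paths, and fallback are illustrative assumptions:

```groovy
// Sketch: file-scoped build properties replace conditional logic in JCL.
// In zAppBuild, application-conf/file.properties can scope a value to a file pattern:
//   cobol_compileParms = LIB,SOURCE,CICS :: **/cobol/online/*.cbl
//   cobol_compileParms = LIB,SOURCE :: **/cobol/batch/*.cbl
import com.ibm.dbb.build.BuildProperties

def props = BuildProperties.getInstance()
String buildFile = 'MyApp/cobol/online/epsmlist.cbl'  // assumed file being built

// the generic build script resolves the value matching the file being built,
// so one script serves every program without embedded IF/THEN/ELSE logic
String compileParms = props.getFileProperty('cobol_compileParms', buildFile) ?: 'LIB'
println "Compiling ${buildFile} with parms: ${compileParms}"
```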
The creation process for mainframe programs is well known: typically, a compilation step and a link-edit step are sufficient to produce an artifact that can be deployed to another environment, although some pre-compilation steps may still exist. Along with IBM DBB, a set of Groovy scripts is available as samples in the zAppBuild public Git repository to drive the compilation of the major types of mainframe artifacts (COBOL, PL/I, Assembler, REXX, BMS and MFS maps, z/OS Connect APIs, etc.). These scripts contain the typical tasks for building artifacts, with properties that can be customized if needed, and represent a best-of-breed, optimized process for creating deployable artifacts. Starting from these scripts is the recommended path, as they already contain the required processing, controlled through properties. They can also serve as examples for implementing support for artifact types that are not supported out of the box. Other languages and formats may become available in the future to define these typical compilation steps with IBM DBB.
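The following condensed Groovy sketch shows the compile and link-edit pattern implemented by the zAppBuild language scripts (see Cobol.groovy in the zAppBuild repository for the complete logic). Dataset names, the member name, and compile parms are illustrative, and several DD statements (work files, SYSLIB concatenations, SYSPRINT) are omitted for brevity:

```groovy
// Sketch: the two typical steps to build a COBOL program with the DBB toolkit API.
import com.ibm.dbb.build.MVSExec
import com.ibm.dbb.build.DDStatement

def member = 'EPSMLIST'  // assumed program name

// step 1 - compile: invoke the Enterprise COBOL compiler
MVSExec compile = new MVSExec().pgm('IGYCRCTL').parm('LIB,SOURCE')
compile.dd(new DDStatement().name('SYSIN').dsn("USER.BUILD.COBOL($member)").options('shr'))
compile.dd(new DDStatement().name('SYSLIN').dsn("USER.BUILD.OBJ($member)").options('shr').output(true))
// SYSUTn work files, SYSLIB concatenations, SYSPRINT, etc. are omitted for brevity
int rc = compile.execute()

// step 2 - link-edit: bind the object deck into an executable load module
if (rc <= 4) {
    MVSExec linkEdit = new MVSExec().pgm('IEWBLINK').parm('MAP,RENT')
    linkEdit.dd(new DDStatement().name('SYSLIN').dsn("USER.BUILD.OBJ($member)").options('shr'))
    linkEdit.dd(new DDStatement().name('SYSLMOD').dsn("USER.BUILD.LOAD($member)").options('shr').output(true))
    rc = linkEdit.execute()
}
println "Build of $member ended with RC=$rc"
```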
To summarize, this "no-baggage" approach aims to simplify and accelerate your transition to DevOps practices. More than a step forward, it's a leap into a new way of building mainframe applications and managing their delivery lifecycle!
To understand how IBM can accompany you in this transition, please check the DevOps Acceleration Program page for more information. Also check how we can help enable the enterprise DevOps transformation journey through our technical library and assets.
Thanks to @Ian Mitchell @Lauren Li @Dennis Behm @Shabbir Moledina for their reviews.