This article is the second of a four-part series addressing Db2 for z/OS and modern development utilizing an Agile methodology and DevOps processes. To read the other articles in the series, please follow these links:
In this article we define modern development terms in a manner that people most familiar with Db2 for z/OS might better understand. While Db2 for z/OS is not the first database that comes to mind when considering a DevOps development methodology, there is no reason why it shouldn’t be!
There is a lot of uneasiness that comes with not understanding a technology, technique, or philosophy. This can lead to a significant amount of discontent and resistance to the changes in the development world as they are presented to us. So, for someone who has not yet had the opportunity to explore what is going on in the modern development world, here are a few definitions to help break the ice. Some of these definitions are inter-related, so it’s best to read through all of them and then read through them all again. Remember, in the end, it’s all still 0’s and 1’s!
If a bunch of definitions is too much to take on right now, then consider the following. This is about remaining competitive in your own industry. Application development has changed. The competitive landscape has changed. No matter your industry, your company is also an IT company. Consequently, innovation delivery to your internal or external customers must outpace your competitors. To meet these challenges, efficiencies far beyond the methods of the past must be employed. Lacking such evolution won’t mean that you are not busy with processes of the past; it likely will mean, however, that your enterprise innovation efforts have left the platform behind. Systems of record still exist and remain important, but critical data mass is accumulating elsewhere, on more nimble platforms. Is this inevitable? No! With the proper vision and the right tooling, the mainframe can support the enterprise’s fast release cycles, through to the data sources, with the same astonishing service and quality it delivers for its other qualities of service.
One cannot talk about DevOps without talking about Agile, since many of the principles of DevOps expand upon the Agile methodology. The Agile development methodology abandons strict waterfall development for a faster and more flexible process that favors collaboration over documentation (documentation is not eliminated) and self-organization over strict rules and procedures. This solves some of the problems associated with large-scale software development: with Agile, the process is broken up into steps in which smaller pieces of a project can be developed, tested, approved, and integrated. Issues with scope creep, staff turnover, and user expectations are addressed by this piece-wise development process. Development is carried out by small teams of individuals working to deliver specific pieces of software very quickly, in very small intervals of time. The result is that software can be delivered faster, at least for specific desired functionality. The downside is that the development process can be viewed as disjointed and chaotic. However, as with any project management, proper leadership and adoption of the Agile processes brings organization.
So, with each new methodology comes a set of new terms that really define the same sorts of things that seasoned mainframe “old timers” are used to doing, only in a different context. Application development is not really changing as much as these terms imply.
While they may sound silly, the concepts of scrum and sprint are what enable a development team to implement applications quickly and efficiently. Scrum, by definition, is the framework of a small development team of 10 or fewer members that meets daily for 15 minutes or less to report on any progress, or impediments to progress, of development efforts. The idea is to break up complex software development into smaller pieces in order to enable quick and efficient implementation of these software components. A sprint is a fast, repeatable, fixed-timeframe iteration for delivery of a software component. The timeframe varies, but typically a sprint lasts about two weeks. This is a key component of the DevOps and Agile methodology, and it can place considerable demands on Database Administrators to properly configure various database schemas in support of the many sprints of application development.
The thing to remember here is that the teams are smaller and thus more numerous, the design and status meetings are shorter and more to the point, and the development process is brief. This requires fast response from data administration and database administration staff to ensure that the Agile development does not experience a bottleneck, and this is definitely an area where DevOps automation can help. It is also the primary reason why mainframe professionals need to get educated regarding this methodology and be able to adapt to the higher demands on resources, which may include hardware, software, database definition and configuration changes, and personnel.
DevOps is a conglomeration of development and operations in which the goal is to offer continuous software deployment and monitoring. It builds on Agile principles and expands on them with continuous deployment, testing, operations, and collaboration, utilizing automation whenever possible. This leads to a more robust and stable development process offering continuous deployment and continuous integration of software components. Automation is a key component of the DevOps process and enables fast deployment of stable software components. There are many tools that support this automation; some of the most popular include Git, Maven, Jenkins, Ansible, Chef, Splunk, and UrbanCode Deploy. These tools assist a developer with tasks such as code management, testing, and deployment, which enables a single user, or a small team, to manage their software and its testing and deployment. These tools are all customizable and offer significant flexibility. A DevOps Engineer is the person responsible for implementing, customizing, and maintaining these various automation tools. Deployment cycles for new software components can be measured in weeks utilizing Agile and DevOps, versus months or years utilizing a waterfall methodology. There are software tools available to assist mainframe professionals with the responsibility of supporting DevOps on the mainframe. Through collaboration and automation, part of DevOps is to seamlessly transition features into production while meeting the enterprise’s non-functional requirements (security, performance, quality, etc.). There is a wealth of information and freeware available on the internet to get mainframe professionals up to speed.
Integrated Development Environment (IDE)
An IDE is basically a text editor with enhancements that aid in software development, which may include, but are not limited to, a source code editor with syntax checking, an integrated debugger, a compiler, build automation, and in some cases version management. The purpose of an IDE is to aid in software development and thereby speed up the development process. IDEs are quickly being adopted as the gold standard for software development. VS Code from Microsoft is a free IDE that is enhanced by thousands of extensions supporting a wide variety of languages, including Db2 SQL. IDEs can be compared to ISPF Edit, but are more graphical in nature and can have features and capabilities far beyond it. This could mean that source code is stored outside of the mainframe, but not necessarily so.
A software framework provides a layer of abstraction that supplies generic functionality in an effort to speed up the application development process. Frameworks can be incorporated into a variety of programming languages and IDEs, and they can be used to build a variety of online or batch transaction processing applications. In some cases, very powerful applications can be built quickly, but often these applications have relatively poor performance with little chance for tuning. Software frameworks generate a lot of very large, unmanageable SQL, which is the bane of many performance-minded DBAs.
It is difficult to draw comparisons between a software framework and z/OS based mainframe code development. However, software tools such as Telon or Micro Focus were perhaps somewhat close in comparison.
Continuous delivery is a software development approach that produces software in short cycles, and it is an important part of the DevOps process. This sort of development cycle allows incremental software fixes, changes, or additions to be delivered quickly, in support of the DevOps and Agile methodology, getting software into production sooner for enhanced functionality and/or market advantage.
Continuous delivery is already a reality for the Db2 12 for z/OS professional. Rather than arriving in a new release every two to three years, new features are incorporated into the Db2 12 maintenance stream and enabled via the activation of new function levels once the requisite maintenance has been applied.
Continuous integration is the process of continually integrating code changes from multiple sources into a single copy, primarily through automation. This allows multiple software developers to modify code and merge their changes into a single master copy. Continuous integration shouldn’t be confused with continuous delivery; rather, it is the thing that happens prior to continuous delivery. This is a key component of the DevOps process. The concept of continuous integration can also apply to database DDL, since DDL is in essence code. Once database-as-code is part of the continuous integration process, it becomes a candidate for continuous delivery.
Continuous integration is not necessarily a new concept for the mainframe professional. Mainframe development has relied on teams of application developers working together on a piece of software.
Git is open-source software for tracking changes in any set of files. Its purpose in programming is to maintain a repository of software source code, tracking changes and maintaining version control of that code. Git is quickly becoming the gold standard for software source code management. Git has a server component that is used to store a central repository of source code and track all of the changes to it. Organizations can set up their own Git server or choose from a variety of web-based Git servers. An individual can then download software source code that is shared among several developers, and Git will track and merge the changes made to the software. Git can be used to maintain database DDL, and it is being utilized more and more for just that purpose, enabling a portion of the concept of “database as a service”: Application Developers and Database Administrators can coordinate and track changes to database structures and deploy databases to support development activity quickly and accurately. Version control, or change control, is essential for a development environment that supports continuous development and continuous integration.
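As a small, entirely hypothetical sketch of what this looks like in practice, the following shell session puts a DDL file under Git control, develops a schema change on its own branch, and merges it back. The table name, file name, branch name, and user identity are all invented for illustration:

```shell
set -e
repo=$(mktemp -d)          # throwaway repository for the demonstration
cd "$repo"
git init -q
git config user.email "dba@example.com"   # hypothetical identity
git config user.name  "Example DBA"

# The initial DDL is checked in as code
cat > orders.ddl <<'SQL'
CREATE TABLE ORDERS
  (ORDER_ID   INTEGER NOT NULL,
   ORDER_DATE DATE    NOT NULL);
SQL
git add orders.ddl
git commit -q -m "Initial ORDERS table DDL"

# A schema change is developed on its own branch...
git checkout -q -b add-status-column
cat >> orders.ddl <<'SQL'
ALTER TABLE ORDERS
  ADD COLUMN STATUS CHAR(1);
SQL
git commit -q -am "Add STATUS column to ORDERS"

# ...and merged back into the shared copy
git checkout -q -
git merge -q add-status-column

git log --oneline    # the full, auditable change history of the DDL
```

The same history and merge machinery that developers use for application code then applies unchanged to the database definitions.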
Git can be compared to software tools, such as Endevor, CCS or Panvalet which have assisted application developers with the tracking of changes and change control for years. The advantage of Git is that it is widely used and accessible from multiple platforms and operating systems, including Git for z/OS!
Software tools can automate the building and testing of software: they automate the compiling, deployment, and testing of application software to streamline the build process and speed application development and delivery. Examples of such tools are Jenkins, Maven, and UrbanCode Deploy. Their attractiveness is that they are highly customizable, enabling a DevOps Engineer to automate any combination of compiling, testing, deployment, and so on.
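To make the idea concrete, here is a deliberately simplified shell sketch of the build-test-deploy sequence such tools orchestrate. The stage names and messages are invented; a real pipeline (a Jenkins job, for instance) would invoke actual compile, test, and deploy commands at each stage:

```shell
set -e   # a non-zero exit from any stage stops the pipeline

# Each function stands in for a pipeline stage; names and
# messages are hypothetical placeholders.
build()     { echo "stage: build - compiling source"; }
run_tests() { echo "stage: test - running automated tests"; }
deploy()    { echo "stage: deploy - promoting tested artifacts"; }

# Stages run in order; a failure anywhere halts the sequence,
# which is the per-stage gating a build-automation tool enforces.
build
run_tests
deploy
echo "pipeline: success"
```

The value of the real tools lies in doing this gating automatically on every code change, with logging, notifications, and repeatability that an ad hoc script lacks.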
There really isn’t a comparison to traditional mainframe development in this regard because it is a relatively new concept that is coming to the mainframe, although at least Jenkins and UrbanCode Deploy work with z/OS. Traditional mainframe development often involved testing teams and coordinated deployment activities during scheduled outages involving multiple teams. These automated build tools are nothing like that, but will be impacting the mainframe world when Db2 for z/OS is involved as the data server.
DevOps is all about the unification of all the processes and automation involved in modern application development, and the DevOps Engineer is the person responsible for implementing and maintaining these various components. This position not only requires an understanding of the development architecture and philosophy, but may also require the technical skills to implement and maintain the various software tools, such as Git, Jenkins, and UrbanCode Deploy, that enable automation of various development activities. This position is key to implementing the DevOps culture in an organization and to how that organization will control application development, testing, and deployment while utilizing automation whenever possible.
There can be some comparison between a DevOps Engineer and perhaps a Systems Programmer for z/OS, although the DevOps Engineer is more on the application development side of things. However, Systems Programmers have relied on things such as console automation scripts and SMS ACS routines to automate certain tasks. You could also compare it to job scheduling software, which took over the tasks of running jobs (submitting batch card decks, if you’re old enough), checking return codes, and handling dependencies.
Database as a Service
This is the meat and potatoes for a Database Administrator, or possibly an Application Developer who has the ability to manage database design and implementation. The concept is that software infrastructure exists that enables the collaboration and automation of database object definitions and changes, the management and deployment of those definitions to a target data server, and the monitoring of standards. This is the core of the IBM DevOps Experience: to set up, manage, and deploy the database objects in support of Application Developers as part of an overall DevOps culture of application development. Data definition language (DDL), in essence, becomes code. The DDL is source-controlled, and deployments are managed within a set of tools, including the IBM DevOps Experience.
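As a minimal, hypothetical illustration of DDL becoming code, the following shell snippet compares two versions of an invented CUSTOMER table definition. Once the DDL lives in files like these, a schema change surfaces as a reviewable diff, just like any other source change:

```shell
set -e
work=$(mktemp -d)    # scratch directory for the two DDL versions

# Version 1 of a hypothetical table definition
cat > "$work/customer_v1.ddl" <<'SQL'
CREATE TABLE CUSTOMER
  (CUST_ID INTEGER NOT NULL,
   NAME    VARCHAR(40));
SQL

# Version 2 adds a column
cat > "$work/customer_v2.ddl" <<'SQL'
CREATE TABLE CUSTOMER
  (CUST_ID INTEGER NOT NULL,
   NAME    VARCHAR(40),
   REGION  CHAR(2));
SQL

# diff exits 1 when the files differ, so mask that under set -e;
# the output is the reviewable schema change
diff -u "$work/customer_v1.ddl" "$work/customer_v2.ddl" || true
```

A tooling layer such as the IBM DevOps Experience then takes a change like this through review, approval, and deployment to the target data server.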
Hopefully, this article has served to break the ice and remove a bit of the mystery surrounding the modern application development infrastructure. In our next article we’ll address another mystery surrounding the available services on z/OS, primarily the IBM Unified Management Server. If you haven’t yet, please read our first article on traditional Db2 development and modern development, Traditional Db2 Development and Modern Database Development.