By Tim McKeoun (Timothy.McKeoun@ibm.com) and Benedict Holste (benedict.holste@de.ibm.com)
Hello everyone! My name is Tim, and I am new to IBM and also new to this part of the world! I come from the USA, but my whole family is from Germany. It has been my dream to live in Europe, and I was very happy that IBM gave me this opportunity. I have experience with IBM Z, since I worked as a mainframe software developer at an insurance company. After three years in software development, I took on a new role as a Developer Advocate. In that role, I represented many z/OS developers and introduced new tools and processes, all with the goal of modernizing the mainframe. I enjoyed it a lot, and I am excited to bring my experience from the customer side into my new work at IBM. I am also lucky to have great colleagues, and I am delighted to write this blog with Benedict about the very interesting topic of the Db2 DevOps Experience!
Why DevOps is important from a Db2 for z/OS perspective
Through my experience as both a mainframe developer and someone who helped modernize the mainframe, I quickly realized how necessary a DevOps transformation is. DevOps is all about breaking down silos between departments and groups of people and equipping everyone with the tooling needed to get the job done as efficiently as possible. One pairing that would benefit greatly from this kind of collaboration is application developers and database administrators. Far too often, developers hit a roadblock in the development process when a table change is needed. Empowering developers with the right tooling to handle these database tasks on their own gives time back to both the developer and the database administrator. Less waiting and more time to focus on critical tasks sounds like a huge win to me!
Another important part of DevOps is where you store artifacts. Silos can exist figuratively between departments and literally when, for example, DDL and source code are stored in different places. Storing them together brings developers closer to objects that are traditionally handled only by DBAs, fostering a more DevOps-like environment. When source code and databases use radically different tools or processes for deployment, they stay siloed; with the DevOps experience, both can be integrated into the same pipeline.
Show, don't tell – Overview of the Db2 DevOps demo
As a team, we realized that the best way to show how this problem can be tackled is through a demo of the Db2 DevOps Experience, IBM's approach to treating database changes just like source code changes.
Our demo addresses the following two personas:
- The application developer, who changes application code and wants to make the corresponding database changes without waiting for someone else
- The database administrator (DBA), who stays in control of the database and approves changes that cannot be applied automatically
We used the following tools to implement the demo:
- Visual Studio Code with the Db2 Developer Extension and z Open Editor extension
- GitHub
- Jenkins
- IBM Dependency Based Build
- Artifactory
- IBM UrbanCode Deploy with the Db2 DevOps Experience plug-in
- IBM Db2 DevOps Experience
Let's take a quick walk through the demo architecture and steps.
- Visual Studio Code, a popular integrated development environment (IDE), is installed and configured to work with a remote GitHub repository where the CICS/COBOL application code and the Db2 for z/OS DDL are versioned. A developer changes the code and the DDL (for example, adding a new column to a table to implement a new function) and then commits and pushes the changes to the repository (a DDL sketch follows after this list).
- The GitHub repository is configured to trigger a Jenkins pipeline. The pipeline contains several steps, from compiling the application source code to deploying the Db2 for z/OS schema changes and load modules.
- One step of the pipeline compiles the COBOL source code; we use IBM Dependency Based Build for that. The resulting load modules are stored in Artifactory.
- The final step of the Jenkins pipeline calls IBM UrbanCode Deploy (UCD) to deploy the Db2 for z/OS schema changes and the load modules. For the Db2 for z/OS schema changes, we use IBM Db2 DevOps Experience (DOE). DOE provides various REST APIs to provision dedicated instances of Db2 objects and to register and execute schema changes. To make it easier to call those APIs from UCD, we created a small plug-in. DOE does all of the work required to implement the schema change (e.g., adding a new column to a table). It compares the existing environment with the DDL from the GitHub repository and generates a migration strategy. In the best case, the strategy consists of non-disruptive ALTER statements that are executed automatically. If the strategy is disruptive (e.g., it includes UNLOAD, DROP, and CREATE), DBA approval is required (see the second sketch after this list).
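To make the developer's change in the first step more concrete, here is a minimal sketch of the kind of versioned DDL that might live in the GitHub repository. The table and column names (CUSTOMER, EMAIL_ADDRESS) are purely illustrative and are not taken from the actual demo.

-- Desired state of the table as versioned in the repository
-- (names are illustrative, not from the demo)
CREATE TABLE CUSTOMER
  (CUST_ID       INTEGER       NOT NULL,
   CUST_NAME     VARCHAR(100)  NOT NULL,
   EMAIL_ADDRESS VARCHAR(254),            -- new column added by the developer
   PRIMARY KEY (CUST_ID));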
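To illustrate the difference between a non-disruptive and a disruptive migration strategy mentioned in the last step, here is a simplified sketch in Db2 for z/OS SQL. Again, the object names are illustrative, and the exact statements DOE generates depend on the object definitions and its configuration.

-- Non-disruptive strategy: a simple ALTER that can be executed automatically
ALTER TABLE CUSTOMER
  ADD COLUMN EMAIL_ADDRESS VARCHAR(254);

-- Disruptive strategy (simplified): requires DBA approval before execution
-- 1. Unload the existing data (UNLOAD utility)
-- 2. Drop and re-create the table with the changed definition
DROP TABLE CUSTOMER;
CREATE TABLE CUSTOMER
  (CUST_ID       INTEGER      NOT NULL,
   CUST_NAME     VARCHAR(50)  NOT NULL,   -- e.g., a shortened column that cannot be altered in place
   EMAIL_ADDRESS VARCHAR(254),
   PRIMARY KEY (CUST_ID));
-- 3. Reload the data (LOAD utility), re-create dependent objects, and rebind affected packages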
This was just a quick run through the demo and its components, and there is much more to tell. If this article has piqued your interest in learning more about the topic and seeing the demo live, please reach out to us. We look forward to meeting you and discussing it with you. In the meantime, check out the resources below for further information.
Resources