I would like to extend your very good explanation of DevOps. The life cycle part (SCLM) is only one part of DevOps, which originally started in the development area. The first name was Extreme Programming (also written Xtreme), which was later renamed Agile Programming; since programming is tightly bound to program design, the name was reinterpreted as Agile Methodology. This was to emphasize a new way of thinking and programming that replaced the older methodologies based on the waterfall principle.
For some years IBM licensed our product ITP Panorama Toolset, which supports programming in a DevOps style.
It is currently built into IBM Wazi Analyze, which uses our product as the backend; IBM created a frontend for it.
It was also planned to integrate our product into ADDI, which has only partly been done.
If you have a very large customer with a nearly unlimited amount of code who wants analysis of and transparency into that code, you can also use our original product; see the page
If you need any further info do not hesitate to contact us.
Original Message:
Sent: 4/8/2024 6:15:00 PM
From: Scott Fagen
Subject: RE: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
Here's the point: before anyone called it "DevOps," Endevor, ChangeMan, SCLM and other "software change management systems" orchestrated the exact same set of processes that any set of integrated DevOps tools (e.g., Git + Jenkins + Artifactory) would perform for distributed software development. DevOps neither "discovered" nor "invented" this mindset, it has existed for decades. The only credit due is for giving it a whizzy name.
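To make the comparison concrete, here is a minimal sketch of the stage-promotion model that both the mainframe SCMs and a Git + Jenkins style pipeline implement. The stage names and the `run_stage` gate are illustrative only, not any product's real API:

```python
# Sketch of promote-through-stages, the model shared by Endevor/SCLM-style
# SCMs and distributed CI/CD pipelines. Stage names are hypothetical.
STAGES = ["unit-test", "qa", "pre-prod", "prod"]

def run_stage(artifact, stage):
    # Placeholder gate: a real pipeline would compile, run tests, or
    # deploy `artifact` to the environment behind `stage` here.
    return True

def promote(artifact):
    """Walk an artifact through each stage; stop at the first failure."""
    history = []
    for stage in STAGES:
        ok = run_stage(artifact, stage)
        history.append((stage, ok))
        if not ok:
            break
    return history

print(promote("PAYROLL.LOADLIB(MEMBER1)"))
```

Whether the gates are Endevor processor groups or Jenkins stages, the control flow is the same; only the names and the underlying environments differ.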
To be more blunt: telling people that they're out of control or have the wrong mindset (especially when they had the mindset before many DevOps acolytes were even in diapers) is unlikely to induce them to come to your toolset.
Clearly, the tools built to accommodate a far more variable environment can be used for mainframe development. I work with these customers every day. Adopting a "DevOps mindset" is not the issue: mainframe developers and operations staff have been keeping application integrity since the early 1980s with the existing tool set. The value proposition for the newer tools comes from:
1) Reducing the barriers for new developers to work with mainframe code - it's one thing to ask someone from the distributed side to come and work in COBOL and/or PL/I. It's another thing to make them learn a bunch of arcane tools that they've never seen before in a user interface that resembles Pong. What's the ratio of mainframe developers to distributed? 1:5? 1:10? 1:100?
1a) Keeping the source code for different deployment platforms in different repositories just makes life harder when having to design, implement, and test application code that crosses those platforms. It's more effective to allow someone with the appropriate skill to make changes "on both sides" and then test those changes. A common development repository eases this for writing code and a common deployment tool eases this for testing, especially when the deployment process needs to "stitch together" the various systems so they can talk to each other.
2) Reducing the complexity of deploying applications across multiple platforms - as applications become more entwined between different platforms, it only makes sense to centralize deploy capability so that they are appropriately synchronized and potentially backed out across platforms.
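The "synchronized and potentially backed out across platforms" idea in point 2 can be sketched as a simple orchestration loop: apply each platform's deploy step, and if any step fails, back out the steps already applied in reverse order. The platform names and step functions below are made up for illustration:

```python
# Hedged sketch of a coordinated cross-platform deploy with backout.
# steps: list of (name, deploy_fn, backout_fn) tuples; names are hypothetical.
def deploy_all(steps):
    """Run each deploy; on any failure, back out completed steps in reverse."""
    done = []
    for name, deploy, backout in steps:
        try:
            deploy()
            done.append((name, backout))
        except Exception:
            for _, undo in reversed(done):
                undo()  # roll back what was already deployed
            return False
    return True

steps = [
    ("z/OS CICS region",   lambda: None, lambda: None),
    ("Linux microservice", lambda: None, lambda: None),
]
print(deploy_all(steps))  # True: both succeed, nothing backed out
```

The point is only the shape of the control flow; a real deploy tool adds health checks, partial-failure policies, and audit logging around it.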
To your question, "If someone would start from scratch with the mainframe, who would build a cloud without DevOps and automation?" The answer is, "Nobody did. They just didn't call it DevOps, they called it Endevor: ENvironment for DEVelopment and OpeRations."
For more: CA Endevor – A History Lesson for Today's DevOps Pipelines
I'll sit back now and wait for the responses telling me about some edge cases that mainframe SCMs didn't have to deal with being core and important distinctions to DevOps.
------------------------------
Scott Fagen
Mainframe Evangelist
CDW
www.cdw.com/content/cdw/en/solutions/ibm-zsystems.html
------------------------------
Original Message:
Sent: Mon April 08, 2024 08:23 AM
From: Michael Grötzner
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
The value DevOps brings lies in the single responsibility for development and operation. It's not different teams with all their communication problems. You can always see what value lies in automation if you look at z/OS. I don't know of any environment that is more automated than z/OS operations. Just think about what is already automated via policies and automation products like IBM System Automation: WLM, Sysplex, Storage, ....
Operations was the main focus for automation in the past because environments and applications were very stable and operations was the most critical area. That changes in today's world: new applications and faster application changes are needed to meet business needs.
Whether I have multiple applications that end up on separate servers or on a few servers is, for me, just a question of stakeholders and approvers for a process. In fact, for me a z/OS parallel sysplex is a cloud. Therefore, whatever value there is in running workload in a cloud is also a value for the mainframe. If someone were to start from scratch with the mainframe, who would build a cloud without DevOps and automation?
The DevOps process for a mainframe might be more sophisticated, but other than that, I do not see any difference between a mainframe, and say a cloud.
To repeat myself: It's just a different mindset.
Where do you see the difference?
------------------------------
Michael Grötzner
z/OS I/O Configuration Architect
IBM Germany Research & Development GmbH
Original Message:
Sent: Mon April 08, 2024 03:13 AM
From: Brigitte Kroll
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
I couldn't agree more, Scott. We implemented Endevor at customer sites 20 years ago, as the Mainframe needs to deploy to countless application instances, NOT, as distributed systems do, to countless SERVERS.
So the question really is: "what value does the new deployment software bring, and does it really fit Mainframe deployment?"
------------------------------
Brigitte Kroll
Original Message:
Sent: Mon April 01, 2024 05:29 PM
From: Scott Fagen
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
Geez, there's nothing like retconning to make your point.
All of the major mainframe source code management tools (SCMs) have had automated build and deploy capabilities that would work for unit test, QA, pre-prod, prod, etc. within the product for decades. By "major," I mean: Endevor, Librarian, Panvalet, ISPW and SCLM. There are other, now-obsolete tools out there (e.g., BIM-EDIT, Alchemist), and even these have long had a phase-oriented build and deploy capability. This functionality was appropriate for the environments and (especially) the constraints it was designed for: very expensive compute and hyper-expensive storage, where every cycle and byte was counted and managed. Deployment is generally to a small set of known environments, and production deployment could be to a single set of libraries, even for hundreds or thousands of application instances.
So-called "modern" DevOps was designed for a completely different set of constraints: cheap compute and cheaper storage, automated, just-in-time provisioning of the underlying test environment and a need to deploy production to hundreds, if not thousands of server instances.
I fully agree that for many reasons, mainframe installations should adopt the tools and processes that have been developed to solve the scaling problems endemic to distributed software deployment. None of those reasons includes "because there aren't automated processes in place today." The primary driver of resistance to these new tools is not "we like working with stone knives and bearskins," but rather "what's the value of changing these processes that have worked for decades?"
------------------------------
Scott Fagen
Mainframe Evangelist
CDW
www.cdw.com/content/cdw/en/solutions/ibm-zsystems.html
Original Message:
Sent: Sat March 30, 2024 01:39 AM
From: Saurabh Banerjee
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
Absolutely agree with your perspective Michael. It's evident that z/OS is designed for automation and integration, and while the focus historically leaned towards operational automation, the landscape is evolving. With the increasing frequency of changes to infrastructure, there's a clear shift towards automating deployment processes. The tools and capabilities to automate deployments across z/OS, middleware, and applications are readily available. However, as you rightly pointed out, the primary obstacle often lies in the mindset. Implementing automation requires a proactive approach and a shift in mindset towards embracing change and innovation. It's akin to any other CI/CD pipeline or DevOps initiative – the key lies in defining and executing the strategy. The real question now becomes not about the possibility or methodology but rather about seizing the opportunity. Who will be the trailblazer to implement and leverage this automation, thereby gaining a competitive edge in this space? It's a compelling challenge waiting to be embraced.
Yes, we are implementing a DevOps process for IBM Z, automating development builds across the different stages with a continuous testing process.
------------------------------
Saurabh Banerjee
Original Message:
Sent: Thu March 21, 2024 02:44 AM
From: Michael Grötzner
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
I agree that it's not a question of whether it can be done. z/OS is all about automation and integration. In the past, automation concentrated more on automating operations than on automating deployment. That was the more pressing problem due to the relatively small and infrequent changes to the infrastructure. The required number of changes increased over time. I agree that most deployments can already be automated. All tools, features, and functionalities to automate z/OS deployments, middleware deployments, and application deployments are available. The main barrier is the mindset. Just because it can be done, you still need to do it. It's like any other CI/CD pipeline and DevOps process. You just need to define and implement it. Therefore, the main question for me isn't about whether and how, but about who will be the first to implement it and gain the competitive advantage in this area.
@Saurabh, are you implementing a DevOps process for IBM Z in your organization?
------------------------------
Michael Grötzner
Original Message:
Sent: Tue March 19, 2024 08:30 AM
From: Saurabh Banerjee
Subject: DevOps Revolution on the Mainframe: How It's Reshaping IBM z/OS
For decades, IBM z/OS has been the workhorse of the enterprise world, powering mission-critical applications with unparalleled security and stability. But the IT landscape is changing. Businesses demand faster innovation and agility, which can seem at odds with the traditional, siloed approach of mainframe development.
Enter DevOps – a methodology that bridges the gap between development and operations. Many might think "DevOps on the mainframe? Isn't that an oxymoron?" Surprisingly, not! Here's how DevOps is reshaping IBM z/OS:
From Waterfall to Continuous Flow: Traditionally, mainframe deployments were slow and methodical. Code changes trickled down a waterfall of approvals, leading to lengthy release cycles. DevOps flips the script, promoting continuous integration and delivery (CI/CD). This means smaller, more frequent updates, allowing businesses to adapt and innovate faster.
Automation is King: DevOps is all about automation, and the mainframe is ripe for it. Repetitive tasks like provisioning environments, testing code, and deploying updates can all be automated, freeing developers to focus on bigger challenges. Tools like IBM Db2 DevOps Experience for z/OS are streamlining these processes.
Collaboration Conquers Silos: DevOps fosters a culture of collaboration between developers, operations teams, and security professionals. This breaks down traditional silos, leading to a more efficient and responsive IT organization.
Security Stays Paramount: Security is a top priority for z/OS, and that won't change with DevOps. DevOps tools can be integrated with existing security protocols, ensuring that automation doesn't come at the expense of data protection.
Modern Tools for a Modern Era: Gone are the days of cryptic green screens and arcane commands. Modern tools are making z/OS development more accessible. Version control systems like Git and user-friendly interfaces are attracting a new generation of developers to the mainframe.
The Future is Hybrid: DevOps isn't about replacing the mainframe, it's about integrating it seamlessly with other platforms. APIs and tools like z/OS Connect are making it easier for mainframe data to flow securely to cloud applications and other systems, enabling a truly hybrid IT environment.
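To illustrate the hybrid point: once a mainframe asset is exposed through z/OS Connect, a cloud-side consumer just speaks HTTP and JSON. This is a minimal sketch; the host name, API path, and credential scheme below are hypothetical, not a real z/OS Connect service definition:

```python
# Hedged sketch of a cloud-side client calling a mainframe API exposed
# via z/OS Connect. Host, path, and token are illustrative placeholders.
from urllib.request import Request

def build_account_request(host, account_id, token):
    # Hypothetical REST path; a real z/OS Connect API path is defined
    # by the API package deployed on the server.
    url = f"https://{host}/zosConnect/api/accounts/{account_id}"
    return Request(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    })

req = build_account_request("mainframe.example.com", "12345", "SECRET")
print(req.full_url)
```

The consumer never sees CICS, IMS, or Db2 behind the endpoint; that opacity is exactly what makes the hybrid integration workable.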
The marriage of DevOps and IBM z/OS isn't just a fad, it's a sign of the platform's enduring relevance. By embracing agility, automation, and collaboration, z/OS is ensuring its place in the future of enterprise computing. So, the next time someone says "mainframes are dead," remember: the mainframe is evolving, and with DevOps as its co-pilot, it's ready to tackle the challenges of the modern IT world.
The Big Question: How long before DevOps completely transforms mainframe development, attracting a new generation of talent and solidifying z/OS's position as a cornerstone of the hybrid cloud future? Or will the inherent complexities of the mainframe environment limit the full potential of DevOps?
------------------------------
Saurabh Banerjee
------------------------------