Introduction
With Halloween creeping closer and closer, let’s talk about something truly scary… MANUAL TESTING! Much of the enterprise world still has its manual processes, whether it likes them or not. (And if you’re one of the lucky, fully automated few, then good for you!) As exciting and tempting as it would be to slap those processes into some automation or simply drop them, it isn’t always that easy.
On our z/OS Service Test team, our goal is to apply our regression (and often disruptive) test scenarios before fixes go into the field. These scenarios vary in shape, complexity, and component. Some are automated in tools such as Tivoli Workload Scheduler (TWS) or leverage REXX. Others, to this day, rely on manual operator interaction and intervention. To clarify, we don’t choose to execute these scenarios manually for fun or glory! Their steps can appear straightforward, but they usually involve disruptive activity or interconnected z/OS components. For example, some scenarios terminate JES2 on an active, heavily used system, or bring down System Logger while confirming that CICS, SMSVSAM, TVS, and other components recover on that LPAR. We hadn’t previously found a way to automate many of these scenarios that lets us verify the z/OS features important to us while offering the cloud and pipeline capabilities to really grow our test automation. That is, until we started to use Galasa.
Galasa in z/OS Service Test
Galasa is an open-source framework that allows for integration testing across multiple platforms with DevOps and hybrid cloud capabilities. What we love is the solid support for the z/OS platform, including 3270 screen usage and z/OS abstraction through Galasa managers. Managers encapsulate boilerplate code and abstract complex functionality, such as the use of z/OS REST services for z/OSMF. These managers are then leveraged within Galasa tests through method calls and annotations. For example, a common manager we use is the z/OS Console manager for z/OSMF. It leverages the z/OSMF console services to submit commands and obtain the solicited and unsolicited messages from the system. Instead of worrying about the REST calls and keeping track of the unsolicited-response key, the framework handles all of this and lets me focus on the Java code and the process that matters to us.
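To give a feel for what that abstraction hides, here is a minimal sketch of issuing a command and collecting both kinds of response. It follows the published IZosConsole and IZosConsoleCommand interfaces as we understand them; the surrounding class, method name, and the command itself are just placeholders.

```java
import dev.galasa.zosconsole.IZosConsole;
import dev.galasa.zosconsole.IZosConsoleCommand;
import dev.galasa.zosconsole.ZosConsoleException;

public class ConsoleSketch {

    // 'console' would normally be injected by the framework via a @ZosConsole annotation
    public String displayActiveAddressSpaces(IZosConsole console) throws ZosConsoleException {
        // The manager drives the z/OSMF console REST service behind this call
        IZosConsoleCommand command = console.issueCommand("D A,L");

        // Solicited (immediate) response to the command
        String solicited = command.getResponse();

        // Unsolicited (delayed) messages; the framework keeps track of the response key for us
        String unsolicited = command.requestResponse();

        return solicited + "\n" + unsolicited;
    }
}
```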
We’ve noticed that Galasa works very well with what we want to achieve: automating our manual scenarios while maintaining the same level of quality and coverage. As a test developer, I can focus on my Java code and on designing the process that’s important to my team, without worrying about the inner workings of yet another test product or how to integrate it. The z/OS features work well. While we’ve primarily used the z/OS Console manager, I’ve also dabbled with the z/OS Batch manager, which has been useful for tracking active jobs submitted by users. As for the z/OS Console manager, I find issuing commands and getting messages, both solicited and unsolicited, to be very robust, letting a test be as dynamic as it needs to be.
Here are two examples of Galasa test code in Java, written for demonstration purposes.
The first shows how to initialize z/OS Console and Batch objects through image tags, which are defined in property files. It also shows the use of annotations, such as the Before annotation for methods that run ahead of each test. The “Getting Started” section of the Galasa webpage goes into the test basics in more detail.
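A skeleton along these lines might look as follows. The annotations come from the standard Galasa core and z/OS managers; the image tag PRIMARY and the class name are placeholders, so treat this as a sketch rather than our production code.

```java
package dev.galasa.example;

import static org.assertj.core.api.Assertions.assertThat;

import dev.galasa.Before;
import dev.galasa.Test;
import dev.galasa.zos.IZosImage;
import dev.galasa.zos.ZosImage;
import dev.galasa.zosbatch.IZosBatch;
import dev.galasa.zosbatch.ZosBatch;
import dev.galasa.zosconsole.IZosConsole;
import dev.galasa.zosconsole.ZosConsole;

@Test
public class ScenarioDemoTest {

    // The image tag maps to a z/OS image defined in the test property files
    @ZosImage(imageTag = "PRIMARY")
    public IZosImage image;

    // Console and batch objects are bound to the same image tag
    @ZosConsole(imageTag = "PRIMARY")
    public IZosConsole console;

    @ZosBatch(imageTag = "PRIMARY")
    public IZosBatch batch;

    @Before
    public void checkInjection() {
        // Runs before each test method; confirm the managers provisioned the objects
        assertThat(console).isNotNull();
        assertThat(batch).isNotNull();
    }
}
```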
The second shows the z/OS Console manager displaying IPL information and the z/OS Batch manager listing some active jobs. If you’re familiar with JUnit, you’ll notice similarities with Galasa tests in Java.
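A rough version of that second example, shown here as a separate class for clarity, might look like this. The getResponse and getJobs calls follow the published IZosConsoleCommand and IZosBatch interfaces as we understand them, and the IEE254I check and the jobname/owner filters are illustrative only.

```java
package dev.galasa.example;

import static org.assertj.core.api.Assertions.assertThat;

import java.util.List;

import dev.galasa.Test;
import dev.galasa.zos.IZosImage;
import dev.galasa.zos.ZosImage;
import dev.galasa.zosbatch.IZosBatch;
import dev.galasa.zosbatch.IZosBatchJob;
import dev.galasa.zosbatch.ZosBatch;
import dev.galasa.zosconsole.IZosConsole;
import dev.galasa.zosconsole.IZosConsoleCommand;
import dev.galasa.zosconsole.ZosConsole;

@Test
public class DisplayDemoTest {

    @ZosImage(imageTag = "PRIMARY")
    public IZosImage image;

    @ZosConsole(imageTag = "PRIMARY")
    public IZosConsole console;

    @ZosBatch(imageTag = "PRIMARY")
    public IZosBatch batch;

    @Test
    public void displayIplInfo() throws Exception {
        // Issue D IPLINFO on the tagged image; the display comes back under message IEE254I
        IZosConsoleCommand command = console.issueCommand("D IPLINFO");
        assertThat(command.getResponse()).contains("IEE254I");
    }

    @Test
    public void displayActiveJobs() throws Exception {
        // List jobs matching the jobname and owner filters (placeholders here)
        List<IZosBatchJob> jobs = batch.getJobs("*", "TESTUSR");
        assertThat(jobs).isNotNull();
        for (IZosBatchJob job : jobs) {
            // Each entry exposes details such as the job ID
            assertThat(job.getJobId()).isNotBlank();
        }
    }
}
```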
Our Future Plans for Galasa
- Our Galasa journey has just begun, but we have many goals planned. So far, we’ve adapted four of our z/OS Service Test scenarios into fully functional Galasa tests. We plan to automate more of our test scenarios to reduce our manual test burden. This will also enable us to build a test suite that can be run with a variety of parameters and even targeted at other sysplexes or environments.
- Galasa managers can also be created and contributed. During our test development, we noticed common code being written in each test, and resources that could be encapsulated within a manager class. We have a couple of Galasa managers in development and hope to have them fully functional in our environment soon. Doing so reduces the complexity and technical debt of our code.
- At the time of writing, our Galasa tests are primarily written and tested locally within an IDE, but this is not how we plan to run tests long term. The Galasa ecosystem allows for full automation of the test process. The ecosystem is a cloud-native application that exposes microservices and APIs. Tests can be pushed to and stored in the ecosystem, then kicked off from a CI/CD pipeline or from an IDE. We hope to integrate it with our Jenkins pipeline and allow for automated runs of these regression scenarios. Additionally, the ecosystem provides data visualization, result archive storage (such as run logs and stored artifacts), security options, and other hybrid cloud features.
Summary
We feel that the Galasa framework is a great way to automate our manual z/OS scenarios, with plenty of room for growth. Not only does it give us the coverage and control that meet our business needs, it also enables us to develop our cloud integrations and expand our automation pipeline to improve the lifecycle of these scenarios. We look forward to continuing our Galasa journey and sharing our progress along the way.
If you’d like to learn more about Galasa or find out how to get started, check out the Galasa Homepage. And feel free to reach out to us if you have any questions about our experiences with Galasa tests or managers. We’d especially appreciate any feedback on what you’d like to hear next, or whether you’ve had similar or different experiences.
Resources
Galasa Homepage
Galasa Getting Started