
Why is 13 the lucky number for Ansible?

  

IBM z/OS Ansible Core 1.7.0 has been released to Automation Hub and Galaxy and is available on GitHub. This release includes the new modules zos_archive and zos_unarchive, Jinja templating in both the zos_copy and zos_job_submit modules, enhancements, bug fixes and deprecation notices. The release comes with optional support and service for Automation Hub users and community support for Galaxy and GitHub users. If you are wondering why I chose a spooky image, well, besides releasing only weeks away from Halloween, this is our 13th GA release, it includes 13 enhancements and 13 bug fixes, and it shipped on Friday the 13th, a tremendous accomplishment.


New Modules

As you know, great things come in pairs, so we released the modules zos_archive and zos_unarchive, which can manage all your compression needs for both Unix System Services (USS) and z/OS data sets. There are no additional dependencies; these modules leverage existing applications, native utilities and Python libraries. Used together, they cover several data-in-transit use cases, such as software maintenance where you might need to unpack an archive, a backup strategy, or selecting data to be put into cold storage; in every case, these modules will improve the Ansible automation experience.

You can choose from the following formats (a brief USS example follows the list):

  • bz2
  • gz
  • tar
  • zip
  • terse
  • xmit
  • pax
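
For USS paths, a minimal sketch might look like the following; the directory, file pattern and archive names are placeholders:

- name: Archive USS log files into a tar file.
  ibm.ibm_zos_core.zos_archive:
    src: "/tmp/app/logs/*.log"
    dest: "/tmp/app/logs.tar"
    format:
      name: tar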

For example, imagine that some source files in an MVS data set need to be archived into an existing data set and then unarchived; it is as simple as these 11 lines of code.

- name: Archive the contents of a data set with format terse.
  ibm.ibm_zos_core.zos_archive:
    src: "USER.DATA.SRC"
    dest: "USER.DATA.ARCHIVE.TRS"
    format:
      name: terse

- name: Unarchive a terse data set.
  ibm.ibm_zos_core.zos_unarchive:
    path: "USER.DATA.ARCHIVE.TRS"
    format:
      name: terse

Expanding on the prior example, let's assume the destination data set does not exist and we want to specify the attributes of the new data set, force deletion should there be a duplicate data set, and, when the module completes, clean up and remove the source data set. This can be done with the following 12 lines of code.

- name: Archive the contents of a data set with format terse, create data set destination and remove the source data set on completion.
  ibm.ibm_zos_core.zos_archive:
    src: "USER.DATA.SRC"
    dest: "USER.DATA.ARCHIVE.TRS"
    force: true
    format:
      name: terse
    remove: true
    dest_data_set:
      space_primary: 10
      space_secondary: 3
      space_type: K

Jinja templating

Jinja templating is available natively in both the zos_copy and zos_job_submit modules. In a previous blog, I discussed some use cases showing how you might use templating to manage multiple configurations across various Logical Partitions (LPARs) with only one template. I also discussed how you can dynamically generate JCL specific to each system. It is important to note that all templating happens on the Ansible controller before the task is executed on z/OS.


Jinja offers a powerful and reliable way to generate dynamic content, reduce redundancy and minimize both syntax and source errors. With the Jinja templating enhancement, you can use all the standard Jinja filters and tests. Filters change the value of a variable; they are applied with a pipe symbol (|) and may take optional arguments in parentheses. Tests evaluate a variable against a common expression. A short example combining a filter and a test follows the lists below.

Some of the filters available are shown below; for the full list, review here.

abs() format() batch() indent() int()
sum() max() min() pprint() random()
map() upper() slice() sort() string()

Some of the tests available are shown below; for the full list, review here.

boolean() even() in() mapping() sequence()
callable() false() integer() ne() string()
defined() filter() iterable() none() test()
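
As a minimal sketch, a filter and a test can be used directly in a task; the variable name programmer here is only an illustration:

- name: Show the upper filter and the defined test in use.
  ansible.builtin.debug:
    msg: "The job card programmer field will be {{ programmer | upper }}"
  when: programmer is defined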


For example, you might want to dynamically generate job cards for JCL (Job Control Language). In the playbook snippet, the vars are statically defined, but they can come from dynamic sources such as the zos_gather_facts module. The Jinja snippet is from a larger template, JCL.j2, that will later be contributed to our playbook repository, and the zos_job_submit task uses the module's new use_template option to dynamically populate the template.

Vars:

vars:
  sh_program_name: "UPTIME"
  programmer: "IBMUSER"


Jinja template snippet from 'JCL.j2':

//{{ sh_program_name }} JOB T043JM,JM00,1,0,0,0,'{{ programmer }}',


Playbook task:

- name: Submit JCL and populate job card using Jinja.
  ibm.ibm_zos_core.zos_job_submit:
    src: "{{ playbook_dir }}/files/JCL.j2"
    location: LOCAL
    use_template: true
  register: job_output

Enhancements

Of our substantial list of 22 modules, 7 were enhanced; here we discuss the most noteworthy changes.

zos_copy now displays the data set attributes for data sets that are created by the module. This is helpful when the expectation is that the module will compute the necessary data set attributes and create it. The module has also been updated to no longer perform automatic recovery backups during the module's life cycle. If data integrity is critical, please define the backup options provided in the module. 
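
For instance, a minimal sketch of explicitly requesting a backup before a destination is overwritten might look like this; the file and data set names are placeholders, and backup_name can be omitted to let the module choose one:

- name: Copy a file to a data set and back up the destination first.
  ibm.ibm_zos_core.zos_copy:
    src: "{{ playbook_dir }}/files/profile"
    dest: "USER.TARGET.DATA"
    backup: true
    backup_name: "USER.TARGET.DATA.BKP"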

zos_data_set introduced record format F (fixed), where one physical block on disk is one logical record and all the blocks and records are the same size.
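
A minimal sketch of requesting the new record format when creating a data set follows; the data set name and space values are placeholders:

- name: Create a sequential data set with record format F.
  ibm.ibm_zos_core.zos_data_set:
    name: USER.FIXED.DATA
    type: SEQ
    state: present
    record_format: F
    record_length: 80
    space_primary: 5
    space_type: M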

zos_job_output, zos_job_query and zos_job_submit now display the job information 'asid', 'creation date', 'creation time', 'job class', 'priority', 'queue position', 'service class' and, conditionally, 'program name' (when ZOAU is v1.2.4 or later).
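
To see these fields, a minimal sketch registers the module result and prints it; the job name is a placeholder and the exact return field names are listed in the module documentation:

- name: Query jobs named UPTIME and register the result.
  ibm.ibm_zos_core.zos_job_query:
    job_name: "UPTIME"
  register: query_output

- name: Display the returned job attributes, including the newly added fields.
  ansible.builtin.debug:
    var: query_output.jobs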

Bugfixes

Several of the modules included fixes that we will briefly discuss; for a complete list, please review the release notes or changelog.
zos_copy corrected the behavior where subdirectories would not be properly encoded and, in other cases, where a mode that was set would not be applied to existing files and directories. In the past, if a data set was in use or accessed by another process, the module could not obtain a lock to perform the copy but would still report a successful transfer. Since copying to data sets in use by another process is not supported in this release, the module was enhanced to check whether a data set is in use and, if so, gracefully and properly exit the module's execution.

zos_data_set corrected a randomly occurring case where components of a VSAM cluster would be left behind when the cluster was deleted with state=absent.

zos_fetch and zos_copy both corrected a 'play_context.verbosity' warning that would appear on newer versions of Ansible.

zos_operator behavior was updated to no longer scan for keywords such as 'invalid' or 'error', in favor of providing a transparent interaction and encouraging the automation to review the results and decide whether a module should error. It is important to note that, without the keyword detection, what prior versions would have treated as an error is now the automation's responsibility to register and evaluate. To automate keyword detection, register the task's output and parse it with a regular expression search; for example, search for 'invalid' while ignoring case and place the result into the var cmd_words.

cmd_words: "{{ cmd_output | regex_search('(?i)invalid') }}"

Then combine cmd_words with a conditional such as a when statement and use the fail module to error when the keyword is present.

- fail:
    msg: "Keyword 'invalid' was found in the command response."
  when: cmd_words is defined and 'invalid' in cmd_words | lower
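
Putting the pieces together, a minimal sketch might look like the following; the operator command and the registered variable names are illustrative, and the response text returned by zos_operator is in its content list:

- name: Issue an operator command and capture its response.
  ibm.ibm_zos_core.zos_operator:
    cmd: "D U,ALL"
  register: operator_result

- name: Extract any occurrence of the keyword 'invalid' from the response.
  ansible.builtin.set_fact:
    cmd_words: "{{ operator_result.content | join(' ') | regex_search('(?i)invalid') }}"

- name: Fail when the keyword is found.
  ansible.builtin.fail:
    msg: "Keyword 'invalid' was found in the command response."
  when: cmd_words is defined and 'invalid' in cmd_words | lower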

Deprecations

Although these collection versions date back to our first releases and served us well, we are officially deprecating them from our offering. IBM Ansible z/OS core collection versions 1.0.0, 1.1.0 and 1.2.1 will no longer be supported with S&S or by the community and will be removed from the documentation.

In other deprecations, the IBM z/OS core collection will no longer support ansible 2.9 (AAP version 1.2), ansible-core 2.11 (AAP version 2.0), ansible-core 2.12 (AAP version 2.1) and ansible-core 2.13 (AAP version 2.2). Another resource to refer to is the Ansible porting guide, which can inform you of known issues, breaking changes, collection changes and deprecations.

You might be wondering why we are deprecating ansible 2.9 (AAP version 1.2) now, when in the community's support matrix it has been end of life (EOL) since last year. Well, our content is certified, and that means we align our support to the Ansible Automation Platform (AAP); this offering, being downstream from the community's, provides longer support cycles for enterprise customers. If you would like to be aware of these changes before we announce them, you can review the Red Hat Ansible Automation Platform Life Cycle.

When you review the Red Hat Ansible Automation Platform Life Cycle, find the 'Ansible Automation Platform on-premises included packages and versions' table and use its first two columns, 'AAP version' and 'Execution environments', to map the ansible-core version to the AAP version.

 

Then locate the 'Ansible Automation Platform Life Cycle' table and review the 'AAP version' and 'Maintenance support ends' columns to determine when support ends.

 
About the Author
Demetrios Dimatos is the IBM z/OS Ansible Core Senior Technical Lead with 16 years of mainframe experience and over 20 years of development experience, having led multiple products ranging from client-server technologies and administration consoles to IBM Open Platform (Hadoop - HDFS, MapReduce, Yarn) and Spark, Linux, and Solaris kernel development.

Resources

IBM Ansible z/OS core on Galaxy
IBM Ansible Core Collection Repository on GitHub
IBM Ansible Core Collection on Automation Hub
Red Hat® Ansible Certified Content for IBM Z documentation