IBM DevOps Deploy for DataPower Gateway Deployments
Overview
Bringing DevOps to DataPower implementations can be a challenge for teams that manage enterprise integration products. In this blog, we look at an example of how to apply DevOps to DataPower using Git (GitHub) and IBM DevOps Deploy, giving you an audit trail, a deployment history, and governance features to ensure changes are introduced in the best way possible.
What is a DataPower Gateway?
The IBM DataPower Gateway brings enterprise-grade security, control, and comprehensive transport and data protocol support to meet the needs of businesses all across the globe. The DataPower Gateway helps to mitigate vulnerabilities of business-critical applications exposed on the web.
A DataPower Gateway provides the following key features:
- Runtime application security enforced across messaging protocols
- Message and protocol transformation
- Application and service protection
It is a powerful solution that businesses depend on, including 400+ financial institutions.
What is IBM DevOps Deploy?
The IBM DevOps Deploy solution supports application release automation for many different types of applications, including DataPower. To support different technologies, DevOps Deploy has a plugin framework that lets you load plugins to meet your specific needs. In this case, you can download the DataPower plugin and start using it to create automated deployment processes. You can load the DataPower plugin on the 'Automation Plugins' page as shown below:
Here is an example component process using the DataPower plugin:
Each DataPower plugin step communicates with the DataPower Gateway using the XML Management Interface (XMI) port as shown below:
In the DataPower Gateway web console, in the default domain, enable the XML Management Interface and check the box for “WS-Management endpoint” as shown below:
By default, the XMI port is 5550 as shown above. This setup is a requirement for the DevOps Deploy plugin steps to work. Without it, you will get a “connection refused” error when you try to run any automated deployments.
An example process for DataPower using Git and DevOps Deploy
In this example, I have set up a process that starts with a request file that is checked into a Git repository. We configured request file templates that developers can use as a starting point when preparing a request. Here are the request file templates we created for our use cases:
- PassThru MPG – a simple pass-through multi-protocol gateway
- Generic MPG – a generic multi-protocol gateway that utilizes a configuration file from the DataPower filesystem
- Config File Upload
- Export and Import
- AWS2Channel
- Channel2AWS
Here is an example of what a request file looks like:
## --------------------------------
## Example FileUpload Request File
## ---------------------------------
Filename = 000000000002.req
Requester = rlange@ibm.com
TargetDomain = DEV
TemplateName = GenericMPG
JiraTicket = <url to jira ticket>
## ------------------------------------------------------
## Parameters for the FileUpload (Generic) Template
## The parameters must match to the template you provide
##
## Place your generic artifacts in this folder structure:
## dp_projects/<projectNumber>/artifacts
## ------------------------------------------------------
SERVICE_NAME = AWSConsumerMPG
FROM_DIRECTORY = artifacts/AWSConsumerMPG/DEV/ALERTS
TO_DIRECTORY = local:///AWSConsumerMPG/ALERTS
The automated deployment process in IBM DevOps Deploy will read this file, set properties used by the component process steps, and ultimately deploy the update to the DataPower Gateway for the target domain that you specify.
Here is an example DevOps Deploy component process that will read an individual request file and then initiate the proper deployment process based on the template specified:
Our process was configured to allow multiple request files to be submitted at a time for deployment processing if required. The process built in DevOps Deploy will run each deployment request in parallel.
To help process the requests, we have configured an “AUTODP” environment where incoming requests are processed as shown below:
The actual deployments will target environments like DEV or QA (see above), which deploy updates to the DataPower domains of the same name. It is a best practice to have your domain names in DataPower match the environment names in DevOps Deploy.
Deployment Example
In this example, I will start as a developer and create two new request files. One request will be for a "ConfigFileUpload" and the other for a "Generic MPG" deployment to the DEV domain of our DataPower Gateway. The request files (*.req) are checked into a Git repository as shown below:
I have set up a GitHub Action that runs at the time of a code commit and notifies DevOps Deploy of the new component version ready for deployment:
If you would like to learn more about our GitHub action that integrates with DevOps Deploy, please read our example usage page found here: https://github.com/HCL-TECH-SOFTWARE/devops-deploy-createcomponentversion-action.
To set up this integration, you will create a file named .github/workflows/CreateDevOpsDeployCompVersion.yml in your GitHub repository. The following inputs are required for the action:
- component – the name or ID of the component in DevOps Deploy
- hostname – the hostname or IP address of the DevOps Deploy Server
- port – the port number for your DevOps Deploy server. The default is 8443.
- authToken – the token value used to authenticate with the DevOps Deploy server. The example uses a secret named DEVOPS_DEPLOY_AUTHTOKEN which is defined in the GitHub repository at Settings->Secrets and variables->Actions.
This action will be triggered when changes are pushed to the main branch of your repository. When triggered, the workflow creates a new DevOps Deploy component version, which in turn can be set up to trigger the application process that performs the deployment.
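To make this more concrete, here is a minimal sketch of what the workflow file might look like, based on the inputs listed above. The job name, runner image, checkout step, hostname placeholder, and the @main action reference are assumptions for illustration; see the action's example usage page linked above for the exact values to use in your environment.

# .github/workflows/CreateDevOpsDeployCompVersion.yml
# Minimal sketch -- job name, runner, checkout step, hostname, and the @main
# ref are assumptions; see the action's example usage page for exact values.
name: Create DevOps Deploy component version

on:
  push:
    branches: [ main ]   # run when changes are pushed to the main branch

jobs:
  create-component-version:
    runs-on: ubuntu-latest
    steps:
      # Make the repository contents (including the *.req files) available
      - uses: actions/checkout@v4

      # Tell DevOps Deploy that a new component version is ready
      - uses: HCL-TECH-SOFTWARE/devops-deploy-createcomponentversion-action@main
        with:
          component: AUTODP-REQUEST-ENGINE                    # name or ID of the component in DevOps Deploy
          hostname: devops-deploy.example.com                 # hostname or IP of your DevOps Deploy server
          port: 8443                                          # default DevOps Deploy port
          authToken: ${{ secrets.DEVOPS_DEPLOY_AUTHTOKEN }}   # token stored as a GitHub Actions secret

With a workflow like this in place, each push to the main branch produces a new component version in DevOps Deploy, which can then kick off the automated request processing described below.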
In this scenario, we see that there is a new component version for our AUTODP-REQUEST-ENGINE component as shown below:
As mentioned above, when a new component version comes in, you can set up DevOps Deploy to automatically trigger a deployment by editing the configuration settings for your target environment. In my case, I am using the AUTODP environment to process all the incoming requests:
As the deployment starts, you will see that the AUTODP-REQUEST-ENGINE component process will find the new request files and trigger deployments for each request in parallel as shown below. If you navigate to the Dashboard, you can see both deployments that are currently running:
For each deployment, you can click a link to view the request in more detail and follow the progress of the automated process. Here is an example for one of the deployments:
If we view the output log for the “Upload Directory” step, we see it interacting with the DataPower Gateway using a DataPower plugin step:
In the log output below, you can see that the step uploaded a configuration file to the DEV domain successfully:
upload-dir:
[dpupload] uploading file "/opt/IBM/devops-deploy/agent/var/work/AUTODP-MODEL-DEPLOY/artifacts/mpg/AWSProxyMPG/ISAM2AWSProxyMPG/config/CUSTOMERREFERENCEDATAMANAGEMENT/CUSTOMERREFERENCEDATAMANAGEMENT.cfg" to "local:///mpg/AWSProxyMPG/ISAM2AWSProxyMPG/config/CUSTOMERREFERENCEDATAMANAGEMENT/CUSTOMERREFERENCEDATAMANAGEMENT.cfg" in domain DEV
BUILD SUCCESSFUL
Total time: 1 second
We can confirm that the configuration file was uploaded successfully by logging into the DataPower Gateway web console, as shown below:
Conclusion
As you have observed from this blog, it is possible to bring DevOps processes to integration solutions like the DataPower Gateway. We leveraged best-of-breed solutions like Git and DevOps Deploy to provide an automated deployment methodology that delivers a solid audit trail and governance controls that can be tailored to your needs, and we even wired in automated testing to ensure that good-quality code progresses through your software delivery lifecycle.
For more information on how DevOps Deploy can support other integration products like App Connect Enterprise (ACE) and API Connect, please see this blog for more details.
IBM DevOps Deploy is a solution leveraged by our clients all across the globe and is very strong in supporting deployments to the mainframe, to on-premises solutions, to containers running in Kubernetes, and to DataPower Gateways. If you are interested in learning more about the solution, please reach out to your local IBM representative, and we would be pleased to provide an overview and demonstration of DevOps Deploy for you and your team.
I would like to provide a special thank you to @asim saddal and @Sudhakar Bodapati for their contributions to this article.