Building a DevOps pipeline for ACE on the Operator-based release of CP4I

Mon August 10, 2020 10:47 AM

IBM Cloud Pak® for Integration V2020.2.1 builds on the previous release and is extended by the following enhancements:

  • Provides Kubernetes Operators that manage every component of the Cloud Pak for Integration, enabling the Cloud Pak to be entirely Kubernetes native, and OpenShift® native. This enables automation at every level of the stack, including the platform itself and the applications that are built on the platform.
  • Simplifies packaging, deployment, management, and updating of Cloud Pak for Integration on the Red Hat® OpenShift Container Platform.
  • Includes the Event Streams Operator that takes advantage of the open source Strimzi Operator published under the Cloud Native Computing Foundation.

I have published several blogs on building CI-CD pipelines for ACE, both on traditional deployments and on Cloud Pak for Integration. This blog continues that series and addresses building a CI-CD pipeline for ACE on the Operator-based release of Cloud Pak for Integration.

Below are the references to my previous blogs:

Building and Deploying ACE projects with Maven and Jenkins
This blog makes use of the ace-maven-plugin. The source code for the ace-maven-plugin is available at:

ace-maven-plugin

 

An Approach to DevOps for ACE and ACE Designer flows on CP4I:
An approach to build DevOps pipeline for ACE on Cloud Pak for Integration 
An approach to build a DevOps pipeline for IBM App Connect Designer Flows on Cloud Pak for Integration

Building CI-CD Pipeline for IBM App Connect Enterprise on Cloud Pak for Integration

This blog assumes that you have already installed the ACE Operator and created an ACE Dashboard in the target namespace. You can refer to the Knowledge Center documentation below to understand the APIs/operands provided by the ACE Operator:

https://www.ibm.com/support/knowledgecenter/SSTTDS_11.0.0/com.ibm.ace.icp.doc/certc_install_appconnectcomponents.html
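Before building the pipeline, it is worth verifying from the CLI that the Operator's custom resource definitions are present in the cluster. A minimal check (the API group names match the resources created later in this blog; 'integration' is the namespace used in this demo):

# Confirm that the ACE Operator CRDs are installed
oc get crd integrationservers.appconnect.ibm.com configurations.appconnect.ibm.com

# List any existing IntegrationServers in the target namespace
oc get integrationservers -n integration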



Click on this link to view the demo


Scenario:

A customer wants to automate the build and release process for IBM App Connect Enterprise to deploy on CP4I 2020.2 onwards. They have considered using:

  • CP4I 2020.2.x
  • Git as source control
  • Nexus to store versioned BAR files
  • Jenkins as the CI tool

The diagram below depicts their target state.

Screenshot_2020-08-10_at_14_07_14.png

Note that you might have multiple logical ACE environments on the same CP4I cluster, isolated by namespaces, or the different ACE environments might be hosted on separate CP4I clusters spread across private or public clouds.

As you can see, this diagram depicts a basic DevOps flow. Lots of interesting things can be done on top of this, such as:

  • Implement continuous testing
  • Automatically roll back to the previous successful release if a test fails
  • Automatically create an issue in a bug-tracking tool, such as JIRA, for failures and assign it to a developer
  • Further enhance it to DevSecOps by introducing security tests, and so on.

One of the major benefits of moving to containers is that it eliminates compatibility issues, including the classic ‘it works on my machine’ problem. You can test your application locally with the same image that you are going to use in the live environments. You would typically pull the respective ACE image from the container registry and deploy the app on your local workstation, where you can also attach a debugger to your flows running in containers. You can follow the GitHub documentation below to deploy ACE containers on your local workstation for development, testing and debugging (a sample run command follows the link):

https://github.com/ot4i/ace-docker
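For example, a local Integration Server can be started from an ACE image roughly as follows (a sketch based on the ot4i/ace-docker instructions; the image name/tag and the local BAR directory are assumptions for illustration):

# Run a local ACE Integration Server; 7600 = web UI, 7800 = HTTP listener
docker run --name aceserver \
  -p 7600:7600 -p 7800:7800 \
  --env LICENSE=accept \
  -v "$(pwd)/bars":/home/aceuser/initial-config/bars \
  ace:11.0.0.9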

Let us see how this basic DevOps flow can be implemented. In most scenarios:

  1. The developer checks in the code after testing it locally with the same image
  2. There could be a manual or automated trigger for the Jenkins build job
  3. The Jenkins build job compiles the ACE application/service, creates versioned BAR files for the respective environments by taking configuration values from each environment’s properties file, and tags the source code with the version number
  4. The Jenkins build job stores these versioned BAR files in a repository, say Nexus
  5. There could be a manual or automated trigger for the Jenkins deployment job for the DEV environment
  6. The Jenkins DEV deployment job pulls the BAR file(s) from the repository (here Nexus), creates an image with the BAR file(s) baked into the base ACE image and pushes the image to the OCP registry of CP4I
  7. The Jenkins DEV deployment job creates Configuration resources with the required configurations for the Integration Server, performs the deployment from the created DEV image and configures other objects such as the horizontal pod autoscaler policy, Route, etc.
  8. Upon successful testing in the DEV environment, there could be a trigger for the Jenkins QA deployment job, which performs the same steps (6 and 7) for the QA environment
  9. After validation and all required testing, the same steps (6 and 7) are performed for the production environment

You could create the image(s) for your ACE application during the build process itself; however, building the image in a separate job, i.e. in a deployment job, has the advantages described below:

  • You have versioned BAR files for respective environments and the source code is tagged with that version. So you can always trace the source code version associated with the respective BAR files.
  • You may include BAR files from one or more build jobs to deploy them in one Integration Server.
  • You have control over the deployment, i.e. whether it is manual or automated. In many scenarios, you may want to test the applications manually before deploying to the target environment.

For scenarios where more than one ACE application/service needs to be deployed in the same Integration Server, the Jenkins deployment job can pull the respective BAR files from the repository (here Nexus) and bake them into the ACE image.

The diagram below depicts two ACE applications being built and deployed in an IntegrationServer on CP4I.

Screenshot_2020-08-10_at_14_21_12.png

1. Configure Build Server

Ensure that you have configured the build server by following steps 1 to 5 of the recipe below:

Building CI-CD Pipeline for IBM App Connect Enterprise on Cloud Pak for Integration

Note that the instructions in this blog do not use the OpenShift and Docker plugins, so you may skip installing those plugins in Jenkins; however, the Docker and oc clients must be installed on the build server, and you should be able to log in to the target OCP cluster using the CLI from this server.
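For example, a quick check from the build server (the token, server URL and registry route are placeholders for your environment):

# Log in to the target OCP cluster from the build server
oc login --token=<api-token> --server=https://<cluster-api-endpoint>

# Log in to the cluster's internal image registry with the same identity
docker login -u "$(oc whoami)" -p "$(oc whoami -t)" <registry-route>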

2. Configure ACE project and Build Job

For this demonstration, we will create two additional folders inside the ACE project:

– Properties folder: This folder contains the properties files for the different environments, e.g. DEV.properties and QA.properties. These properties files contain the environment-specific parameters used by the mqsiapplybaroverride command to override node properties and UDPs (see the example command after this list)

– Build folder: This folder contains the Jenkins pipeline script for the ACE project, so the build script for the ACE project is also checked in to source control
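As a sketch of how a properties file is consumed, the build ultimately drives the mqsiapplybaroverride command roughly like this (the file names mirror the build output shown later and are illustrative):

# Produce an environment-specific BAR by applying the DEV overrides
mqsiapplybaroverride -b Publisher_api.bar -o DEV_2.0.bar -p properties/DEV.properties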



For this demonstration, I have created two ACE REST APIs that publish and subscribe messages to and from IBM EventStreams deployed on CP4I.

Publisher_api: https://github.com/awasthan/Publisher_api.git

Consumer_API: https://github.com/awasthan/Consumer_API.git

Note that both of these projects are Maven projects and use the ace-maven-plugin to build and deploy BAR files to the Nexus repository. You can follow the blog mentioned in the references above to configure the ace-maven-plugin.

There is one more project, created to define the Kafka policy, which is referenced by the Kafka nodes of the above REST APIs:

Kafka_policy: https://github.com/awasthan/kafka_policy.git

Let us examine the ‘jenkinsfile’ script for the Publisher_api project.
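At its heart, after checking out the project, the build stage runs the Maven release goals, which the ace-maven-plugin hooks into to compile the project, apply the per-environment overrides, version the BAR files and tag the source. The key command is also visible in the build log further below:

# Prepare and perform a release: compile, apply overrides, version the BARs, tag the source
mvn -f Publisher_api/pom.xml -B release:prepare release:perform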



Now let us create the Jenkins build job for this Publisher_api project.

Create a Jenkins pipeline project with the name ‘Publisher_api’.



Specify that the pipeline script should be read from SCM, and enter the GitHub project location, the credentials to access GitHub and the path of the jenkinsfile within the project.



Click on Save.

Now build the project so that it uploads the BAR files to the repository and tags the code. The snippet below shows a successful build of the Publisher_api project (output has been trimmed for brevity).

Started by user anand

Obtained Publisher_api/build/jenkinsfile from git https://github.ibm.com/anand-awasthi/Publisher_api.git

Running in Durability level: MAX_SURVIVABILITY

[Pipeline] Start of Pipeline

[Pipeline] timestamps

[Pipeline] {

[Pipeline] node

21:58:35  Running on Jenkins in /var/lib/jenkins/jobs/Publisher_api/workspace

[Pipeline] {

[Pipeline] wrap

21:58:35  Xvfb starting$ /usr/bin/Xvfb :0 -screen 0 1024x768x24 -fbdir /var/lib/jenkins/xvfb-1-..fbdir1142937909955833988

[Pipeline] {

[Pipeline] stage

[Pipeline] { (Publisher_API - Checkout)

[Pipeline] checkout

[Pipeline] // stage

[Pipeline] stage

[Pipeline] { (Publisher_API - Build)

[Pipeline] withMaven

21:58:36  [withMaven] Options: []

21:58:36  [withMaven] Available options:

21:58:36  [withMaven] using JDK installation provided by the build agent

21:58:36  [withMaven] using Maven installation 'maven'

[Pipeline] {

[Pipeline] sh

21:58:37  + mvn -f Publisher_api/pom.xml -B release:prepare release:perform

21:58:37  ----- withMaven Wrapper script -----

21:58:37  Picked up JAVA_TOOL_OPTIONS: -Dmaven.ext.class.path="/var/lib/jenkins/jobs/Publisher_api/workspace@tmp/withMaven3d9d46eb/pipeline-maven-spy.jar" -Dorg.jenkinsci.plugins.pipeline.maven.reportsFolder="/var/lib/jenkins/jobs/Publisher_api/workspace@tmp/withMaven3d9d46eb"

21:58:37  Apache Maven 3.6.0

21:58:37  Maven home: /usr/share/maven

21:58:37  Java version: 1.8.0_265, vendor: Private Build, runtime: /usr/lib/jvm/java-8-openjdk-amd64/jre

21:58:37  Default locale: en, platform encoding: UTF-8

21:58:37  OS name: "linux", version: "4.15.0-88-generic", arch: "amd64", family: "unix"

21:58:38  [INFO] [jenkins-event-spy] Generate /var/lib/jenkins/jobs/Publisher_api/workspace@tmp/withMaven3d9d46eb/maven-spy-20200809-162838-1643435097943494995579.log.tmp ...

21:58:38  [INFO] Scanning for projects...

21:58:39  [INFO]

21:58:39  [INFO] ---------------------< com.ibm.esb:Publisher_api >----------------------

21:58:39  [INFO] Building Publisher_api 2.0-SNAPSHOT

21:58:39  [INFO] ------------------------------[ ace-bar ]-------------------------------

21:58:39  [INFO]

21:58:39  [INFO] --- maven-release-plugin:2.5.3:prepare (default-cli) @ Publisher_api ---

21:58:40  [INFO] Verifying that there are no local modifications...

21:58:40  [INFO]   ignoring changes on: **/pom.xml.releaseBackup, **/pom.xml.next, **/pom.xml.tag, **/pom.xml.branch, **/release.properties, **/pom.xml.backup

21:58:40  [INFO] Executing: /bin/sh -c cd /var/lib/jenkins/jobs/Publisher_api/workspace/Publisher_api && git rev-parse --show-toplevel

21:58:40  [INFO] Working directory: /var/lib/jenkins/jobs/Publisher_api/workspace/Publisher_api

21:58:40  [INFO] Executing: /bin/sh -c cd /var/lib/jenkins/jobs/Publisher_api/workspace/Publisher_api && git status --porcelain .

21:58:40  [INFO] Working directory: /var/lib/jenkins/jobs/Publisher_api/workspace/Publisher_api

workspace/Publisher_api/properties/DEV.properties

22:00:51  [INFO] [INFO] BIP1138I: Applying overrides using runtime mqsiapplybaroverride...

22:00:52  [INFO] [INFO] BIP1140I: Overriding property gen.Publisher_api#additionalInstances with '0' in 'Publisher_api.appzip/META-INF/broker.xml' ...

22:00:52  [INFO] [INFO] BIP1143I: Saving Bar file /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/ace/DEV_2.0.bar...

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] BIP8071I: Successful command completion.

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] --- ace-maven-plugin:11.38:clean-bar-build-workspace (default-clean-bar-build-workspace) @ Publisher_api ---

22:00:52  [INFO] [INFO] debugWorkspace enabled - workspace will not be cleaned

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] --- ace-maven-plugin:11.38:validate-classloader-approach (default-validate-classloader-approach) @ Publisher_api ---

22:00:52  [INFO] [INFO] Reading configurable properties from: /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/ace/Publisher_api.properties

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] --- ace-maven-plugin:11.38:package-ace-bar (default-package-ace-bar) @ Publisher_api ---

22:00:52  [INFO] [INFO] Building zip: /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/Publisher_api-2.0.zip

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] --- maven-source-plugin:3.0.0:jar-no-fork (attach-sources) @ Publisher_api ---

22:00:52  [INFO] [INFO] Building jar: /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/Publisher_api-2.0-sources.jar

22:00:52  [INFO] [INFO]

22:00:52  [INFO] [INFO] --- maven-javadoc-plugin:2.10.3:jar (attach-javadocs) @ Publisher_api ---

22:00:53  [INFO] [INFO] Not executing Javadoc as the project is not a Java classpath-capable package

22:00:53  [INFO] [INFO]

22:00:53  [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ Publisher_api ---

22:00:53  [INFO] [INFO] No primary artifact to install, installing attached artifacts instead.

22:00:53  [INFO] [INFO] Installing /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/pom.xml to /var/lib/jenkins/.m2/repository/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0.pom

22:00:53  [INFO] [INFO] Installing /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/Publisher_api-2.0.zip to /var/lib/jenkins/.m2/repository/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0.zip

22:00:53  [INFO] [INFO] Installing /var/lib/jenkins/jobs/Publisher_api/perform-workspace/Publisher_api/target/Publisher_api-2.0-sources.jar to /var/lib/jenkins/.m2/repository/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0-sources.jar

22:00:53  [INFO] [INFO]

22:00:53  [INFO] [INFO] --- maven-deploy-plugin:2.8.2:deploy (default-deploy) @ Publisher_api ---

22:00:53  [INFO] [INFO] No primary artifact to deploy, deploying attached artifacts instead.

22:00:53  [INFO] [INFO] Uploaded to releases: http://127.0.0.1:8044/nexus/repository/releases/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0.zip (9.3 kB at 274 kB/s)

22:00:53  [INFO] [INFO] Uploading to releases: http://127.0.0.1:8044/nexus/repository/releases/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0-sources.jar

22:00:53  [INFO] [INFO] Uploaded to releases: http://127.0.0.1:8044/nexus/repository/releases/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0-sources.jar (33 kB at 540 kB/s)

22:00:53  [INFO] [INFO] ------------------------------------------------------------------------

22:00:53  [INFO] [INFO] BUILD SUCCESS

22:00:53  [INFO] [INFO] ------------------------------------------------------------------------

22:00:53  [INFO] [INFO] Total time:  58.922 s

22:00:53  [INFO] [INFO] Finished at: 2020-08-09T16:30:53Z

22:00:53  [INFO] [INFO] ------------------------------------------------------------------------

22:00:53  [INFO] [INFO] [jenkins-event-spy] Generated /var/lib/jenkins/jobs/Publisher_api/workspace@tmp/withMaven3d9d46eb/maven-spy-20200809-162954-4558739549619705143723.log

22:00:53  [INFO] Cleaning up after release...

22:00:53  [INFO] ------------------------------------------------------------------------

22:00:53  [INFO] BUILD SUCCESS

22:00:53  [INFO] ------------------------------------------------------------------------

22:00:53  [INFO] Total time:  02:15 min

22:00:53  [INFO] Finished at: 2020-08-09T16:30:53Z

After successful completion of the pipeline, the versioned BAR files are pushed to the Nexus repository and the release is tagged in GitHub.

3. Configure Deployment Job

Now configure the deployment jobs for each environment. You could have a single parameterized job to deploy to the different CP4I environments; however, for security and compliance reasons, you would probably have separate deployment jobs for the respective environments.

We can have a deployment project containing the Dockerfile, the Jenkinsfile, the IntegrationServer CR YAML, the IntegrationServer Configuration CR YAMLs and the Configuration input files, as depicted in the diagram below.


I have a sample deployment job that follows the above project structure.

https://github.com/awasthan/Kafka_APIs_Operator_Deployment.git

Below is the structure of the sample deployment project Kafka_APIs_Operator_Deployment.

Kafka_APIs_Operator_Deployment

This sample deployment project performs the tasks below:

  • Create an IntegrationServer Configuration CR for the truststore
  • Create an IntegrationServer Configuration CR for setdbparms
  • Create an IntegrationServer Configuration CR for the policy project (in this example, the Kafka policy)
  • Create an IntegrationServer Configuration CR for server.conf.yaml
  • Reference these CRs in the IntegrationServer CR (Kafka_APIs_IS.yaml) and deploy it

The ConfigurationInputs folder contains the files below:

ConfigurationInputs/es-cert.p12

Either PKCS12 or JKS truststores can be configured. Since the APIs contained in this project connect to IBM EventStreams, this es-cert.p12 has been downloaded from IBM EventStreams. The content of this file is base64-encoded and seeded into truststore.yaml before the CR for the truststore is created. This step is performed in the jenkinsfile during deployment.
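A sketch of this seeding step (the placeholder tokens in the template and the namespace are assumptions for illustration; the Configuration API group is visible in the deployment log below):

# Base64-encode the truststore and seed it into the Configuration CR template
CONTENTS=$(base64 -w0 ConfigurationInputs/es-cert.p12)
sed -e "s|PLACEHOLDER_CONTENTS|${CONTENTS}|" \
    -e "s|PLACEHOLDER_NAMESPACE|integration|" \
    ConfigurationResources/truststore.yaml | oc apply -f -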

ConfigurationInputs/kafka_policy.zip

This is the zipped policy project, which may contain one or more policies. In this case, it contains kafka_policy, which is referenced by the Kafka Producer and Consumer nodes of the API flows. The kafka_policy project is also checked into GitHub for reference: https://github.com/awasthan/kafka_policy.git. The content of this file is base64-encoded and seeded into policyProject.yaml before the CR for the policy project is created. This step is performed in the jenkinsfile during deployment.

ConfigurationInputs/setdbparms.txt

This file contains the parameters for security identities. In this case, it contains the password to open the truststore and the password for the ace-consumer to connect to the Kafka cluster (EventStreams). The content of this file is base64-encoded and seeded into setdbparms.yaml before the CR for setdbparms is created. This step is performed in the jenkinsfile during deployment.

ConfigurationInputs/server.conf.yaml

This file contains the configuration parameters for the IntegrationServer. In this case, only the configuration under ResourceManagers.JVM has been overridden.
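As an illustration only (these values are assumptions, not the exact override used in this demo), a truststore-related override under ResourceManagers.JVM might look like this:

# Illustrative server.conf.yaml content; the truststore path depends on
# where the operator mounts the truststore Configuration
cat > ConfigurationInputs/server.conf.yaml <<'EOF'
ResourceManagers:
  JVM:
    truststoreType: 'PKCS12'
    truststoreFile: '/home/aceuser/truststores/es-cert.p12'
EOF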

The content of this file is base64-encoded and seeded into the server.conf.yaml template under ConfigurationResources before the CR for the server configuration is created. This step is performed in the jenkinsfile during deployment.

The files below, inside the ConfigurationResources folder, are the CRs for the truststore, policy project, setdbparms and server.conf.yaml. The namespace and content are dynamically overwritten by the jenkinsfile.

  • policyProject.yaml
  • server.conf.yaml
  • setdbparms.yaml
  • truststore.yaml

The file below is the CR for the IntegrationServer, which references the above four Configuration CRs (a sketch follows the list):

  • Kafka_APIs_IS.yaml
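As orientation, a minimal sketch of such a CR (the resource names match those created in the deployment log below; the exact schema and required license fields depend on your Operator version and entitlement, so treat this as illustrative rather than the actual file from the repository):

# Write a minimal IntegrationServer CR that references the four Configurations
cat > Kafka_APIs_IS.yaml <<'EOF'
apiVersion: appconnect.ibm.com/v1beta1
kind: IntegrationServer
metadata:
  name: kafkaserver
  namespace: integration
spec:
  license:
    accept: true   # your entitlement may also require license.license and license.use
  configurations:
    - setdbparms-conf
    - es-cert.p12
    - policy-conf
    - server-conf
  pod:
    containers:
      runtime:
        image: image-registry.openshift-image-registry.svc:5000/integration/mykafkaimage:15
  replicas: 1
EOF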

The jenkinsfile performs the steps below:

  • Checks out the project
  • Pulls the BAR files for the two ACE APIs mentioned above from the Nexus repository
  • Creates a custom image by baking in the BAR files using the Dockerfile
  • Pushes the baked image to the OCP registry
  • Creates the four Configuration CRs for the IntegrationServer
  • Creates the CR for the IntegrationServer

The script handles both the initial installation and upgrades of the IntegrationServer.
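As a sketch of the job's core commands (reconstructed from the deployment log shown below; the REGISTRY_ROUTE variable and the file layout are assumptions for illustration):

# Bake the pulled BAR files into the base ACE image and push it to the OCP registry
docker build -t mykafkaimage:15 .
docker tag mykafkaimage:15 "$REGISTRY_ROUTE/integration/mykafkaimage:15"
docker push "$REGISTRY_ROUTE/integration/mykafkaimage:15"

# Apply the seeded Configuration CRs, then the IntegrationServer CR
oc apply -f ConfigurationResources/ -n integration
oc apply -f Kafka_APIs_IS.yaml -n integration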

Create a parameterized Jenkins pipeline job to build this deployment project. Look at the parameters referenced in the jenkinsfile and define them in the job as below:



The pipeline script has been configured to be pulled from GitHub.



Similarly, you can configure a deployment job for the QA environment, and so on.

4. Perform Deployment

Now let us deploy an IntegrationServer that contains version 2.0 of Publisher_api and version 2.1 of Consumer_API.

Click on ‘Build with Parameters’ for the deployment job.



Click on Build.
Below is a snippet of the build logs.

Started by user anand

Obtained jenkinsfile from git https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git

Running in Durability level: MAX_SURVIVABILITY

[Pipeline] Start of Pipeline

[Pipeline] timestamps

[Pipeline] {

[Pipeline] node

10:26:07  Running on Jenkins in /var/lib/jenkins/jobs/Kafka_APIs_Operator_Deployment/workspace

[Pipeline] {

[Pipeline] wrap

10:26:07  Xvfb starting$ /usr/bin/Xvfb :0 -screen 0 1024x768x24 -fbdir /var/lib/jenkins/xvfb-15-..fbdir5779774823297484136

[Pipeline] {

[Pipeline] stage

[Pipeline] { (Kafka_APIs_Operator_Deployment - Checkout)

[Pipeline] checkout

10:26:08  using credential jenkins-token

10:26:08  Cloning the remote Git repository

10:26:08  Cloning repository https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git

10:26:08   > /usr/bin/git init /var/lib/jenkins/jobs/Kafka_APIs_Operator_Deployment/workspace # timeout=10

10:26:08  Fetching upstream changes from https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git

10:26:08   > /usr/bin/git --version # timeout=10

10:26:08  using GIT_ASKPASS to set credentials

10:26:08   > /usr/bin/git fetch --tags --progress -- https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git +refs/heads/*:refs/remotes/origin/* # timeout=10

10:26:08   > /usr/bin/git config remote.origin.url https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git # timeout=10

10:26:08   > /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10

10:26:08   > /usr/bin/git config remote.origin.url https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git # timeout=10

10:26:08  Fetching upstream changes from https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git

10:26:08  using GIT_ASKPASS to set credentials

10:26:08   > /usr/bin/git fetch --tags --progress -- https://github.ibm.com/anand-awasthi/Kafka_APIs_Operator_Deployment.git +refs/heads/*:refs/remotes/origin/* # timeout=10

10:26:08   > /usr/bin/git rev-parse refs/remotes/origin/master^{commit} # timeout=10

10:26:08   > /usr/bin/git rev-parse refs/remotes/origin/origin/master^{commit} # timeout=10

10:26:08  Checking out Revision 06350b7a4c95f5dfcdeec5842c9ca0f90d8c6f83 (refs/remotes/origin/master)

10:26:08   > /usr/bin/git config core.sparsecheckout # timeout=10

10:26:08   > /usr/bin/git checkout -f 06350b7a4c95f5dfcdeec5842c9ca0f90d8c6f83 # timeout=10

10:26:08  Commit message: "Update README.md"

10:26:08   > /usr/bin/git rev-list --no-walk 3710b2abbdb6276b4683b9e0b37b5d6f10bf2cdf # timeout=10

[Pipeline] }

[Pipeline] // stage

[Pipeline] stage

[Pipeline] { (Kafka_APIs_Operator_Deployment - Build)

[Pipeline] artifactResolver

10:26:08  INFO: define repo: [Repository id=nexus-release, type=default, url=http://127.0.0.1:8044/nexus/repository/releases/, isRepositoryManager=false]

10:26:08  INFO: set authentication for admin

10:26:08  deleted file:/var/icp-builds/Publisher_api/Publisher_api-2.0.zip

10:26:08  copy /var/nexus_repo/com/ibm/esb/Publisher_api/2.0/Publisher_api-2.0.zip to file:/var/icp-builds/Publisher_api/Publisher_api-2.0.zip

10:26:08  deleted file:/var/icp-builds/Publisher_api/Consumer_API-2.1.zip

10:26:08  copy /var/nexus_repo/com/ibm/esb/Consumer_API/2.1/Consumer_API-2.1.zip to file:/var/icp-builds/Publisher_api/Consumer_API-2.1.zip

[Pipeline] sh

10:26:09  Archive:  Publisher_api-2.0.zip

10:26:09   extracting: DEV_2.0.bar            

10:26:09    inflating: Publisher_api.properties 

10:26:09  Archive:  Consumer_API-2.1.zip

10:26:09    inflating: Consumer_API.properties 

10:26:09   extracting: DEV_2.1.bar            

10:26:13  Login successful.

10:26:13 

10:26:13  You have access to 64 projects, the list has been suppressed. You can list all projects with 'oc projects'

10:26:13 

10:26:13  Using project "integration".

10:26:13  Already on project "integration" on server "https://c107-e.us-south.containers.cloud.ibm.com:32743".

10:26:14  WARNING! Using --password via the CLI is insecure. Use --password-stdin.

10:26:14  WARNING! Your password will be stored unencrypted in /var/lib/jenkins/.docker/config.json.

10:26:14  Configure a credential helper to remove this warning. See

10:26:14  https://docs.docker.com/engine/reference/commandline/login/#credentials-store

10:26:14 

10:26:14  Login Succeeded

10:26:14  Sending build context to Docker daemon  158.2kB

 

10:26:14  Step 1/4 : FROM default-route-openshift-image-registry.gsidemos-7ec5d722a0ab3f463fdc90eeb94dbc70-0000.us-south.containers.appdomain.cloud/integration/ace-server-prod:11.0.0.9-r2

10:26:14   ---> dab820271458

10:26:14  Step 2/4 : COPY *DEV*.bar /home/aceuser/initial-config/bars/

10:26:14   ---> Using cache

10:26:14   ---> 178ab2b17551

10:26:14  Step 3/4 : EXPOSE 7600 7800 7843 9483

10:26:14   ---> Using cache

10:26:14   ---> c3b00bf021c5

10:26:14  Step 4/4 : ENV LICENSE accept

10:26:14   ---> Using cache

10:26:14   ---> c2e61b104b30

10:26:14  Successfully built c2e61b104b30

10:26:14  Successfully tagged mykafkaimage:15

10:26:14  The push refers to repository [default-route-openshift-image-registry.gsidemos-7ec5d722a0ab3f463fdc90eeb94dbc70-0000.us-south.containers.appdomain.cloud/integration/mykafkaimage]

10:26:14  a3e328080f01: Preparing

10:26:14  89d0c35052be: Preparing

10:26:14  46445c823def: Preparing

10:26:14  83f4ee739c97: Preparing

10:26:14  69b3e0170aca: Preparing

10:26:14  613f5a48cc81: Preparing

10:26:14  51128e8189fc: Preparing

10:26:14  613f5a48cc81: Waiting

10:26:14  3a945efe0516: Preparing

10:26:14  ce01644b3aef: Preparing

10:26:14  1d78c1cf5e77: Preparing

10:26:14  14fe86339480: Preparing

10:26:14  51128e8189fc: Waiting

10:26:14  1540698d9e18: Preparing

10:26:14  14f516b1896e: Preparing

10:26:14  ce01644b3aef: Waiting

10:26:14  3a945efe0516: Waiting

10:26:14  96ed4be0412c: Preparing

10:26:14  1d78c1cf5e77: Waiting

10:26:14  1540698d9e18: Waiting

10:26:14  14f516b1896e: Waiting

10:26:14  96ed4be0412c: Waiting

10:26:14  e40686b6aa31: Preparing

10:26:14  1e3f45fc481e: Preparing

10:26:14  b6b26102656c: Preparing

10:26:14  14fe86339480: Waiting

10:26:14  850dd45d18b2: Preparing

10:26:14  f6a7bceff83b: Preparing

10:26:14  458299b871b5: Preparing

10:26:14  7f754f02d368: Preparing

10:26:14  e40686b6aa31: Waiting

10:26:14  1e3f45fc481e: Waiting

10:26:14  850dd45d18b2: Waiting

10:26:14  b6b26102656c: Waiting

10:26:14  f6a7bceff83b: Waiting

10:26:14  458299b871b5: Waiting

10:26:14  7f754f02d368: Waiting

10:26:14  89d0c35052be: Layer already exists

10:26:14  a3e328080f01: Layer already exists

10:26:14  83f4ee739c97: Layer already exists

10:26:14  69b3e0170aca: Layer already exists

10:26:14  46445c823def: Layer already exists

10:26:14  613f5a48cc81: Layer already exists

10:26:14  51128e8189fc: Layer already exists

10:26:14  3a945efe0516: Layer already exists

10:26:14  1d78c1cf5e77: Layer already exists

10:26:14  ce01644b3aef: Layer already exists

10:26:14  14fe86339480: Layer already exists

10:26:14  1540698d9e18: Layer already exists

10:26:14  14f516b1896e: Layer already exists

10:26:14  96ed4be0412c: Layer already exists

10:26:14  e40686b6aa31: Layer already exists

10:26:14  1e3f45fc481e: Layer already exists

10:26:14  b6b26102656c: Layer already exists

10:26:14  f6a7bceff83b: Layer already exists

10:26:14  850dd45d18b2: Layer already exists

10:26:14  458299b871b5: Layer already exists

10:26:14  7f754f02d368: Layer already exists

10:26:15  1.0: digest: sha256:c924568cadd9a709f9cd954265394c32f8c434f158e3816041ed1e307d0ed6c2 size: 4718

10:26:16  configuration.appconnect.ibm.com/setdbparms-conf created

10:26:17  configuration.appconnect.ibm.com/es-cert.p12 created

10:26:18  configuration.appconnect.ibm.com/policy-conf created

10:26:19  configuration.appconnect.ibm.com/server-conf created

10:26:20  integrationserver.appconnect.ibm.com/kafkaserver created

[Pipeline] }

[Pipeline] // stage

[Pipeline] }

10:26:20  Xvfb stopping

[Pipeline] // wrap

[Pipeline] cleanWs

10:26:21  [WS-CLEANUP] Deleting project workspace...

10:26:21  [WS-CLEANUP] Deferred wipeout is used...

10:26:21  [WS-CLEANUP] done

[Pipeline] }

[Pipeline] // node

[Pipeline] }

[Pipeline] // timestamps

[Pipeline] End of Pipeline

Finished: SUCCESS

Let us look at the IntegrationServer and Configuration custom resources that have been created.



Notice the created Configuration and IntegrationServer resources, and note the ‘CUSTOMIMAGES’ column value.
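The same can be viewed from the CLI (resource names as created in the deployment log above):

# List the Configuration and IntegrationServer resources created by the deployment job
oc get configurations.appconnect.ibm.com -n integration
oc get integrationservers.appconnect.ibm.com -n integration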

Let us take a look at the ACE Dashboard.







Attached below are the resources used in this demo scenario.

Consumer_API.zip
Kafka_APIs_Operator_Deployment.zip
kafka_policy.zip
Publisher_api.zip


#AppConnectEnterprise(ACE)
#IBMCloudPakforIntegration(ICP4I)
#DevOps

Comments

Tue February 09, 2021 11:31 PM

Hi Soukaina,
     You need to have 'Xvfb' installed on the server where the Jenkins build runs. The error suggests that it is not able to find Xvfb.
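For example, on an Ubuntu-based build server (assuming the apt package manager is available):

# Install the X virtual framebuffer required by the Jenkins Xvfb plugin
sudo apt-get install -y xvfb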

Tue February 09, 2021 03:41 AM

Actually, when I build the project Publisher_API in Jenkins I get this error:
https://github.com/soukainakhalkhouli/Publisher_API.git
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] timestamps
[Pipeline] {
[Pipeline] node
08:38:22  Running on Jenkins in /var/lib/jenkins/jobs/Publisher_api/workspace
[Pipeline] {
[Pipeline] wrap
08:38:22  Xvfb starting$ Xvfb :0 -screen 0 1024x768x24 -fbdir /var/lib/jenkins/xvfb-3-..fbdir5185204646674115253
[Pipeline] // wrap
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // timestamps
[Pipeline] End of Pipeline
java.io.IOException: error=2, No such file or directory
	at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
	at java.base/java.lang.ProcessImpl.<init>(ProcessImpl.java:340)
	at java.base/java.lang.ProcessImpl.start(ProcessImpl.java:271)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1107)
Caused: java.io.IOException: Cannot run program "Xvfb": error=2, No such file or directory
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1128)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
	at hudson.Proc$LocalProc.<init>(Proc.java:252)
	at hudson.Proc$LocalProc.<init>(Proc.java:221)
	at hudson.Launcher$LocalLauncher.launch(Launcher.java:936)
	at hudson.Launcher$ProcStarter.start(Launcher.java:454)
	at org.jenkinsci.plugins.xvfb.Xvfb.launchXvfb(Xvfb.java:587)
	at org.jenkinsci.plugins.xvfb.Xvfb.setUp(Xvfb.java:697)
	at org.jenkinsci.plugins.workflow.steps.CoreWrapperStep$Execution2.doStart(CoreWrapperStep.java:97)
	at org.jenkinsci.plugins.workflow.steps.GeneralNonBlockingStepExecution.lambda$run$0(GeneralNonBlockingStepExecution.java:77)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
Finished: FAILURE