Installing IBM Event Streams on EKS
In this blog post, we will walk through the installation of IBM Event Streams, a powerful event-streaming platform built on Apache Kafka, on Amazon EKS (Elastic Kubernetes Service). EKS is a managed Kubernetes service that simplifies deploying, managing, and scaling containerized applications. This guide covers both online and air-gapped installation methods, using a private image registry for the offline path.
Overview
IBM Event Streams provides a robust platform for building event-driven applications. By leveraging the capabilities of Apache Kafka, it allows organizations to process and analyze real-time data streams efficiently. In this guide, we will utilize the instructions provided in the official IBM documentation found at IBM Event Automation Overview.
Installation Prerequisites
Before we begin, ensure you have the following prerequisites in place:
- EKS Environment: If you are an IBMer, you can reserve an EKS environment from IBM TechZone. Otherwise, make sure you have your EKS environment ready.
- IBM Entitlement Key: You will need an IBM entitlement key to pull the Event Streams images from the IBM Entitled Registry. If you don't have one, the IBM Cloud documentation explains how to obtain it.
- Required Tools: Ensure you have the following tools installed on your local machine:
- AWS CLI
- kubectl
- Helm
- (For offline installation) the ibm-pak plug-in (invoked below as oc-ibm_pak) and Skopeo. A quick version check for these tools is shown after this list.
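As an optional sanity check, confirm that each tool is installed and on your PATH (this is our own suggestion, not part of the IBM instructions; exact output varies by version):
aws --version
kubectl version --client
helm version
skopeo --version          # offline installation only
oc-ibm_pak --help         # offline installation only; lists the available ibm-pak subcommands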
Installation Steps
Step 1: Set Up Your Environment
- Configure AWS CLI: Run the following command to configure your AWS CLI with your credentials:
aws configure
- Update kubectl for EKS: Use the following command to configure kubectl to connect to your EKS cluster:
aws eks update-kubeconfig --region <your-aws-region> --name <your-cluster-name>
Example Output:

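Before proceeding, an optional quick check that kubectl can reach the cluster and see the worker nodes:
kubectl cluster-info
kubectl get nodes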
Step 2: Online Installation
Step 1: Add IBM Helm Repository
Add the IBM Helm repository to your Helm configuration:
helm repo add ibm-helm https://raw.githubusercontent.com/IBM/charts/master/repo/ibm-helm
helm repo update
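Optionally, confirm that the operator chart is now visible from the repository you just added (the version listed will depend on the current chart release):
helm search repo ibm-helm/ibm-eventstreams-operator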
Step 2: Create Namespace and Secrets
Create a namespace for Event Streams and a secret for your IBM entitlement key:
kubectl create namespace eks-event-streams
kubectl create secret docker-registry ibm-entitlement-key \
--docker-username=cp \
--docker-password=<your-ibm-entitlement-key> \
--docker-server="cp.icr.io" \
-n eks-event-streams
Step 3: Install Event Streams Operator
Install the Event Streams operator using Helm:
helm install eventstreams-op ibm-helm/ibm-eventstreams-operator -n eks-event-streams
Step 4: Verify Installation
Check the status of the Event Streams operator deployment:
kubectl get deploy eventstreams-cluster-operator -n eks-event-streams -w
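The -w flag watches the deployment until it reports READY 1/1. Equivalently, you can wait for the rollout to complete and then list the operator pods:
kubectl rollout status deployment/eventstreams-cluster-operator -n eks-event-streams
kubectl get pods -n eks-event-streams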
Step 3: Offline Installation (Air-Gapped Environment)
For environments without internet access, follow these steps:
Make sure you have your private registry set up - we recommend using Artifactory.
Step 1: Download Event Streams CASE Using IBM Pak Tool
Use the ibm-pak tool to download the Event Streams CASE. The CASE (Container Application Software for Enterprises) archive contains the image lists, metadata, Helm charts, and other artifacts needed to mirror and install Event Streams offline:
oc-ibm_pak get ibm-eventstreams
Step 2: Generate Image Manifests
Generate image manifests for your private registry:
oc-ibm_pak generate mirror-manifests ibm-eventstreams <target-registry>
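The manifests are written under ~/.ibm-pak/data/mirror/ibm-eventstreams/<case-version>/. A quick look at images-mapping.txt (the file used in the next step) shows the source=target image pairs that will be copied:
wc -l ~/.ibm-pak/data/mirror/ibm-eventstreams/<case-version>/images-mapping.txt
head -3 ~/.ibm-pak/data/mirror/ibm-eventstreams/<case-version>/images-mapping.txt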
Step 3: Copy Images to Private Registry
Authenticate to both source and target registries and copy the images. The following command will run Skopeo copy from the IBM registry to your private registry:
skopeo login cp.icr.io -u cp -p <ibm-entitlement-key>
skopeo login <target-registry> -u <username> -p <password>
cat ~/.ibm-pak/data/mirror/ibm-eventstreams/<case-version>/images-mapping.txt | \
awk -F'=' '{ print "skopeo copy --all docker://"$1" docker://"$2 }' | \
xargs -S1024 -I {} sh -c 'echo {}; {}'
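If the xargs pipeline is awkward to troubleshoot on a slow or unreliable link, the following sketch (our own alternative, not part of the IBM instructions) copies the same mapping file one image at a time and records failures for a later retry:
# Read each source=target pair from the mapping file and copy it with skopeo,
# appending any image that fails to failed-images.txt.
while IFS='=' read -r src dst; do
  echo "Copying ${src} -> ${dst}"
  skopeo copy --all "docker://${src}" "docker://${dst}" || echo "${src}" >> failed-images.txt
done < ~/.ibm-pak/data/mirror/ibm-eventstreams/<case-version>/images-mapping.txt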
Step 4: Install Event Streams Operator from Local Charts
Create a namespace and install the Event Streams operator from the local charts:
kubectl create namespace eks-event-streams-offline
kubectl create secret docker-registry artifactory-secret \
--docker-username=<registry-username> \
--docker-password=<registry-password> \
--docker-server="<your-registry>" \
-n eks-event-streams-offline
helm install eventstreams-op ~/.ibm-pak/data/cases/ibm-eventstreams/3.7.0/charts/ibm-eventstreams-operator-3.7.0.tgz \
-n eks-event-streams-offline \
--set imagePullPolicy="Always" \
--set public.repo=<your-registry> \
--set public.path="cpopen/" \
--set private.repo=<your-registry> \
--set private.path="cp/" \
--set imagePullSecrets=artifactory-secret \
--set watchAnyNamespace=false
Result of Offline Installation:

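To confirm the offline installation, check that the Helm release is deployed and that the operator Deployment (assuming it keeps the same name as in the online install) becomes available:
helm list -n eks-event-streams-offline
kubectl get deploy eventstreams-cluster-operator -n eks-event-streams-offline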
Step 4: Set Up the Ingress Controller with NGINX
Step 1: Add NGINX Helm Repository
Add the NGINX Helm repository:
helm repo add ingress-nginx https://kubernetes.github.io/ingress-nginx
helm repo update
Step 2: Create Namespace and Install NGINX
Create a namespace and install the NGINX Ingress Controller:
kubectl create namespace ingress-basic
helm install nginx-ingress ingress-nginx/ingress-nginx \
--namespace ingress-basic \
--set controller.replicaCount=2 \
--set controller.nodeSelector."kubernetes\.io/os"=linux \
--set defaultBackend.nodeSelector."kubernetes\.io/os"=linux \
--set controller.admissionWebhooks.patch.nodeSelector."kubernetes\.io/os"=linux \
--set "controller.extraArgs.enable-ssl-passthrough="
To retrieve the load balancer IP address, run:
kubectl get svc nginx-ingress-ingress-nginx-controller -n ingress-basic
Example Output:

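On EKS the controller is exposed through an AWS load balancer with a DNS hostname rather than an IP, so you can also extract the hostname directly with jsonpath:
kubectl get svc nginx-ingress-ingress-nginx-controller -n ingress-basic \
  -o jsonpath='{.status.loadBalancer.ingress[0].hostname}'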
Step 5: Deploy Event Streams Instance
After installing the operator, you can deploy an Event Streams instance by creating a custom resource (CR). Download the example configuration and modify it to suit your environment:
curl -O https://raw.githubusercontent.com/IBM/ibm-event-automation/main/event-streams/cr-examples/eventstreams/kubernetes/development.yaml
Edit the development.yaml file to update the host values with your external IP using nip.io, then apply the configuration.
To retrieve the external IP address from the load balancer hostname, use nslookup:
nslookup a1b2c3d4-ingress-basic-123456-1234567890.us-east-1.elb.amazonaws.com
Example nslookup output:

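Any of the addresses returned by nslookup can be used to build the nip.io hostnames in development.yaml. A minimal sketch, using 192.0.2.10 as a placeholder and the adminui host referenced later in this guide:
EXTERNAL_IP=192.0.2.10   # replace with an IP from the nslookup output
echo "adminui.${EXTERNAL_IP}.nip.io"   # use this value for the Admin UI host in development.yaml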
See development.yaml for values to change.
kubectl apply -f development.yaml -n eks-event-streams
Result of Event Streams Instance Deployment:

To verify that the pods are running properly, use:
kubectl get pods -n <namespace>
Example Pod Status Output:

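You can also watch the EventStreams custom resource itself. Assuming the instance reports a Ready condition (typical for Strimzi-based operators, though the condition name can vary by release), the following waits for the deployment to finish; <instance-name> is the metadata.name from development.yaml:
kubectl get eventstreams -n eks-event-streams
kubectl wait eventstreams/<instance-name> -n eks-event-streams \
  --for=condition=Ready --timeout=30m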
Step 6: Create Admin User
To create an admin user for your Event Streams instance, download the admin user configuration and apply it:
Use this guide as a reference.
curl -O https://raw.githubusercontent.com/IBM/ibm-event-automation/main/event-streams/cr-examples/eventstreams/kubernetes/es-admin-kafkauser.yaml
kubectl apply -f es-admin-kafkauser.yaml -n eks-event-streams
Retrieve Admin Password
To retrieve the admin password, run:
kubectl get secret es-admin -n eks-event-streams -o jsonpath="{.data.password}" | base64 -d
Example output for password retrieval:

The retrieved credentials are:
Username: es-admin
Password: dajqKaulwJDvtjfvpU3H5mnWMZXazOH4
(Note: if a '%' appears at the end of the output, it is only your shell marking the missing trailing newline; it is not part of the password.)
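If you prefer to keep the credentials in shell variables for the steps that follow (the username matches the secret name, es-admin):
ES_USERNAME=es-admin
ES_PASSWORD=$(kubectl get secret es-admin -n eks-event-streams \
  -o jsonpath="{.data.password}" | base64 -d)
echo "${ES_USERNAME} / ${ES_PASSWORD}"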
Testing the Installation
Log in to the Admin UI at the host you configured in development.yaml, e.g., https://adminui.<external-ip>.nip.io/.
Example hostname from development.yaml for the Admin UI:

Use the retrieved credentials to log in. Once logged in, download the starter application:

Configure a Topic for Testing
From the admin UI, navigate to "Topics" and "Create topic" to configure a topic for testing.

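If you prefer to create the topic declaratively instead of through the UI, a sketch using a KafkaTopic custom resource is shown below. This assumes the Event Streams build of the Strimzi KafkaTopic CRD (apiVersion eventstreams.ibm.com/v1beta2) and a cluster label that points at your instance name; check the CRDs installed in your cluster, as field names can differ by release:
cat <<EOF | kubectl apply -n eks-event-streams -f -
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaTopic
metadata:
  name: test-topic
  labels:
    eventstreams.ibm.com/cluster: <instance-name>   # metadata.name from development.yaml
spec:
  partitions: 3
  replicas: 1   # keep at or below the number of Kafka brokers in your instance
EOF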
Test with Demo Application
- Download the demo JAR: https://github.com/ibm-messaging/kafka-java-vertx-starter/releases
- Download the properties file from the Event Streams admin UI (Go to "Connect to this cluster" → "Generate credentials").
- Check Java version (must be 8 or 11):
java -version
- Run the demo:
java -Dproperties_path=<path-to-properties> -jar demo-all.jar
Example Demo Run Output:

View the Application
View the application at http://localhost:8080.

Conclusion
Congratulations! You have successfully installed IBM Event Streams on Amazon EKS. Whether you chose the online or offline installation method, you now have a powerful event-streaming platform at your disposal. You can start building event-driven applications and leverage the capabilities of Apache Kafka to process real-time data streams.
Full repository with files and instructions - https://github.ibm.com/Caleb-Ooi/ibm-event-streams-eks
For further information and advanced configurations, refer to the IBM Event Automation Documentation and the Event Streams CR Examples. Happy streaming! For an example of how to set up on Azure AKS, refer to this guide.