In this post, I'll share how you can get your own Event Endpoint Management demo instance with just seven minutes of work.
(Seven minutes of hands-on-keyboard time... there is a lot of waiting-for-stuff-to-run time, but it doesn't sound so impressive if I include waiting time!)
Why should you use this?
The best way to learn about something new is to get hands-on and play with it.
With something like Event Endpoint Management, to start kicking the tyres you need something to put in it. It starts getting interesting to play with once you have a few streams of events to publish and manage with it. You can really start to see the developer experience that it enables once you have a developer portal populated with a variety of real event sources.
But setting all of that up (creating a Kafka cluster, creating a bunch of Kafka topics, starting applications to produce messages to those topics, writing documentation describing the schemas of those messages, creating Kafka credentials and configuring the gateway with them, etc.) takes a little effort.
The idea of giving you this "demo in a box" is to do all of that for you, so you can dive straight into playing with it - with hardly any up-front effort.
What do you get?
Before I jump into showing you how to set it up for yourself, let me show you what you'll get.
- A variety of event sources discoverable through a Developer Portal.
- Kafka topics documented using AsyncAPI and published in an API Manager.
- An underlying Kafka topic for each of these event sources in your own instance of Event Streams, with a range of Connectors creating the streams of events.
Under the covers, the demo script:
- adds the IBM catalog to your OpenShift cluster
- configures Cloud Pak Foundational Services
- installs Event Streams - and creates a Kafka cluster
- sets up a Kafka Connect cluster
- starts up a variety of Connectors, publishing to Kafka topics
- installs Event Endpoint Management - and sets up a catalog and developer portal
- generates AsyncAPI documentation for all of the topics
- sets up credentials for the Event Gateway and publishes the topics to Event Endpoint Management
Okay, you're sold! How can you get one of these for yourself?
I'll go through what you need to do step by step. For each step, I've recorded a video of me doing it. The 📹 icons in the instructions are links to the relevant part of each video if you just want to see a single step.
They are very boring videos - there isn't any audio, so you're spared any more of my voice for these bits. And I didn't want to speed them up, so you can see exactly how long it takes!
I haven't included time to get the pre-requisites that I'm hoping you'll already have:
- command line tools: `oc`, `git`, `make`
- a GitHub account
- an IBMid
- a Twitter account
If you need to go get any of those, then this will take you a little longer than seven minutes, sorry! Hopefully not much longer though.
By the way - I know you'll see a bunch of my passwords and API keys in the videos, but don't worry on my behalf - the cluster I used for the demo has been deleted, and I've revoked all the credentials that you'll see.
Step 0 - get the code
- time: 20 seconds
- pre-requisites: `git`
Clone the demo setup code from github.com/dalelane/event-endpoint-management-demo.
video recording at youtu.be/5YPYKQHmc60
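In other words, this step just amounts to:

```sh
git clone https://github.com/dalelane/event-endpoint-management-demo.git
cd event-endpoint-management-demo
```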
Step 1 - get an API key for stock price events
- time: 50 seconds
- pre-requisites: none
The demo sets up streams of stock price change events, so you need a free API key for the service that provides the prices.
- 📹 0s Find the `template-alphavantage-apikey.yaml` file
- 📹 5s Rename it to `alphavantage-apikey.yaml`
- 📹 15s Go to Alpha Vantage and get a free API key
- 📹 37s Update your `alphavantage-apikey.yaml` file to replace the `apikey` value with your API key
video recording at youtu.be/7ojNmattrSM
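For reference, once edited, the file will end up looking something like this - a sketch only, as the template in the repo defines the exact structure, and the `stringData` form here is an assumption:

```yaml
# sketch of alphavantage-apikey.yaml - the Secret name matches what the
# demo deployment output shows being created
apiVersion: v1
kind: Secret
metadata:
  name: alphavantage
stringData:
  apikey: YOUR_ALPHA_VANTAGE_API_KEY
```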
Step 2 - get an access token for GitHub
- time: 60 seconds
- pre-requisites: a GitHub account
The demo setup automation will grab some resources from the event-endpoint-management-demo repository on GitHub, so you need to give it an access token it can use to do that.
- 📹 0s Find the `template-github-credentials.yaml` file
- 📹 5s Rename it to `github-credentials.yaml`
- 📹 10s Replace the `username` value with your GitHub username
- 📹 20s Go to github.com and create a Personal Access Token (in Settings -> Developer settings)
- 📹 35s Give it a name and an expiration - but don't add any additional scopes
- 📹 50s Update the `github-credentials.yaml` file to replace the `password` value with your access token
video recording at youtu.be/zX9YCklRtIg
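Once edited, the file will look roughly like this - again a sketch, as the template in the repo is the source of truth, and the basic-auth Secret type is an assumption:

```yaml
# sketch of github-credentials.yaml - the username/password fields are
# the values the steps above tell you to fill in
apiVersion: v1
kind: Secret
metadata:
  name: github-credentials
type: kubernetes.io/basic-auth
stringData:
  username: YOUR_GITHUB_USERNAME
  password: YOUR_PERSONAL_ACCESS_TOKEN
```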
Step 3 - get an entitlement key for IBM images
- time: 60 seconds
- pre-requisites: `base64`, an IBM account
Installing the IBM software used in the demo requires credentials for pulling the Docker images from the IBM Entitled Registry, so you need to give it an entitlement key it can use to do that.
- 📹 0s Find the `template-ibm-entitlement-key.yaml` file
- 📹 5s Rename it to `ibm-entitlement-key.yaml`
- 📹 15s Go to My IBM and copy your Entitlement key
- 📹 22s Paste your entitlement key into the empty password section in the `template-dockerconfig.json` file
- 📹 35s base64-encode the contents of the `template-dockerconfig.json` file
- 📹 43s Update the `ibm-entitlement-key.yaml` file to replace the `.dockerconfigjson` value with your base64-encoded Docker config JSON
video recording at youtu.be/s3AofVrMBX0
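If you're not sure about the base64-encoding step, something like this works (flags differ between implementations - GNU coreutils wraps long lines by default, so use `-w 0` there):

```sh
# produce a single-line base64 encoding of the Docker config JSON
base64 < template-dockerconfig.json        # macOS
base64 -w 0 < template-dockerconfig.json   # Linux (GNU coreutils)
```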
Step 4 - get a Twitter API key
- time: 80 seconds
- pre-requisites: a Twitter account
The demo also sets up streams of events with messages from Twitter, so you need a free Twitter API key too.
- 📹 0s Find the `template-twitter-apikey.yaml` file
- 📹 3s Rename it to `twitter-apikey.yaml`
- 📹 12s Go to developer.twitter.com and navigate to the Developer Portal
- 📹 18s Go to your Standalone Apps on your Projects & Apps page
- 📹 23s Create a new standalone app
- 📹 33s Update the `twitter-apikey.yaml` file to replace the `consumerKey` value with your API Key
- 📹 41s Update the `twitter-apikey.yaml` file to replace the `consumerKeySecret` value with your API Key Secret
- 📹 50s Generate an Access Token and Secret (from App settings -> Keys and tokens -> Access Token and Secret -> Generate)
- 📹 60s Update the `twitter-apikey.yaml` file to replace the `accessToken` value with your Access Token
- 📹 67s Update the `twitter-apikey.yaml` file to replace the `accessTokenSecret` value with your Access Token Secret
video recording at youtu.be/p4iLElvQCf8
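When you're done, the file will look something like this - a sketch, as the template in the repo defines the real structure; the Secret name matches what the demo deployment output shows being created:

```yaml
# sketch of twitter-apikey.yaml - the four values come from the
# standalone app you created on developer.twitter.com
apiVersion: v1
kind: Secret
metadata:
  name: twitter
stringData:
  consumerKey: YOUR_API_KEY
  consumerKeySecret: YOUR_API_KEY_SECRET
  accessToken: YOUR_ACCESS_TOKEN
  accessTokenSecret: YOUR_ACCESS_TOKEN_SECRET
```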
Step 5 - get an OpenShift cluster
- (hands-on-keyboard) time: 90 seconds
- pre-requisites: somewhere to create an OpenShift cluster!
You need somewhere to run the demo!
I used TechZone to reserve a demo cluster on the managed Red Hat OpenShift service on IBM Cloud (ROKS).
- 📹 0s Go to your OpenShift provider
- 📹 68s Make sure you have enough space on your worker nodes for all the demo components
video recording at youtu.be/pol7hXxC2is
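If you want to check capacity from the command line once you can log in, something like this gives you a quick view (assuming cluster metrics are available):

```sh
# current CPU/memory usage on each node
oc adm top nodes

# what has already been requested/allocated on each node
oc describe nodes | grep -A 5 "Allocated resources"
```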
When everything was up and running, there was still plenty of room left on my cluster - you can tell I was a bit over-generous with the size of the cluster I reserved!
You probably won't get a cluster instantly, so now is a good time to go get yourself a coffee!
Optional step - choose the storage classes to use
I've chosen storage classes assuming that you'll run this on ROKS as well. If you're running OpenShift somewhere else, you'll likely want to choose your own storage class names.
To choose the storage classes used by the Cloud Pak for Integration components, edit the `cp4i-overrides.yaml` file before you start the demo deployment:
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: cp4i-overrides
  namespace: pipeline-eventdrivendemo
data:
  ibm-common-service-operator-storageClassName: ibmc-file-gold-gid
  ibm-integration-platform-navigator-storageClassName: ibmc-file-gold-gid
  ibm-eventstreams-storageClassName: ibmc-file-gold-gid
  ibm-apiconnect-storageClassName: ibmc-block-gold
```
To choose the storage classes used by the demo deployment pipelines, edit the values identified by running:

```sh
$ find . -name 'pipelinerun*yaml' -print0 | sort -z | xargs -0 grep storageClassName
./02-install-ibm-catalog/pipelinerun.yaml: storageClassName: ibmc-block-gold
./03-install-ibm-common-services/pipelinerun.yaml: storageClassName: ibmc-block-gold
./04-install-platform-navigator/pipelinerun.yaml: storageClassName: ibmc-block-gold
./05-install-event-streams/pipelinerun.yaml: storageClassName: ibmc-block-gold
./06-start-kafka-connectors/pipelinerun.yaml: storageClassName: ibmc-block-gold
./07-install-event-endpoint-management/pipelinerun-install.yaml: storageClassName: ibmc-block-gold
./07-install-event-endpoint-management/pipelinerun-setup.yaml: storageClassName: ibmc-block-gold
./08-publish-topics-to-eem/pipelinerun.yaml: storageClassName: ibmc-block-gold
./08-publish-topics-to-eem/pipelinerun.yaml: storageClassName: ibmc-block-gold
```
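If you want to switch everything to a different storage class in one go, a bulk search-and-replace does the job - for example (with `my-block-storage` as a placeholder for a storage class that exists in your cluster; on macOS use `sed -i ''`):

```sh
# replace the default block storage class in every pipelinerun yaml
find . -name 'pipelinerun*yaml' -print0 | \
    xargs -0 sed -i 's/ibmc-block-gold/my-block-storage/g'
```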
Step 6 - start the demo deployment
- (hands-on-keyboard) time: 40 seconds
- pre-requisites: `oc`, `make`
You've got the demo automation, you've collected the credentials you need, and you've got an OpenShift cluster.
It's time to start the demo deployment.
- 📹 0s Navigate to the OpenShift console
- 📹 8s Log in to the cluster using the `oc` CLI
- 📹 24s Run `make all`
video recording at youtu.be/l3H3Lbl7DCw
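In other words, the whole step boils down to something like this (the token and server URL come from the "Copy login command" option in the OpenShift console):

```sh
# log in to the cluster - substitute your own token and API server URL
oc login --token=<your-token> --server=<your-cluster-api-url>

# start the whole demo deployment
make all
```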
Step 7 - wait patiently for the demo deployment
- (hands-on-keyboard) time: none!
This is going to take a while.
Sorry - the demo setup script could really do with parallelising some steps! I went for simplicity over efficiency. (Pull requests are welcome!)
While it runs, the deployment script only prints which step it has reached:
```sh
dalelane@Dales-MBP-2 event-endpoint-management-demo % make all
subscription.operators.coreos.com/openshift-pipelines-operator-rh created
namespace/pipeline-credentials created
secret/ibm-entitlement-key created
namespace/pipeline-eventdrivendemo created
Now using project "pipeline-eventdrivendemo" on server "https://c102-e.eu-de.containers.cloud.ibm.com:30823".
secret/github-credentials created
secret/alphavantage created
secret/twitter created
configmap/cp4i-overrides created
serviceaccount/pipeline-deployer-serviceaccount created
task.tekton.dev/copy-secret created
task.tekton.dev/create-cp4i-instance created
task.tekton.dev/create-namespace created
task.tekton.dev/create-resource created
task.tekton.dev/install-operator created
task.tekton.dev/wait-for-cp4i-operand created
task.tekton.dev/wait-for-operator created
task.tekton.dev/wait-for-pod created
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-ibmcatalog-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-ibmcatalog-rolebinding created
pipeline.tekton.dev/pipeline-ibmcatalog created
------------------------------------------------------------
Installing the IBM Catalog into the cluster...
------------------------------------------------------------
pipelinerun.tekton.dev/install-ibm-catalog-jsk75
pipelinerun.tekton.dev/install-ibm-catalog-jsk75 condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-commonservices-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-commonservices-rolebinding created
pipeline.tekton.dev/pipeline-cp4i created
------------------------------------------------------------
Configuring IBM Common Services...
------------------------------------------------------------
pipelinerun.tekton.dev/install-ibm-common-services-f62pw
pipelinerun.tekton.dev/install-ibm-common-services-f62pw condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-platformnavigator-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-platformnavigator-rolebinding created
pipeline.tekton.dev/pipeline-cp4i configured
------------------------------------------------------------
Creating the Cloud Pak for Integration Platform Navigator...
------------------------------------------------------------
pipelinerun.tekton.dev/install-platform-navigator-wdmsg
pipelinerun.tekton.dev/install-platform-navigator-wdmsg condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-eventstreams-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-eventstreams-rolebinding created
pipeline.tekton.dev/pipeline-cp4i configured
------------------------------------------------------------
Creating the Event Streams instance...
------------------------------------------------------------
pipelinerun.tekton.dev/install-event-streams-kt9ws
pipelinerun.tekton.dev/install-event-streams-kt9ws condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-kafkaconnect-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-kafkaconnect-rolebinding created
task.tekton.dev/create-kafka-connectors-docker-image created
task.tekton.dev/create-kafka-credentials created
task.tekton.dev/maven created
configmap/pipeline-maven-settings created
pipeline.tekton.dev/pipeline-kafkaconnectors created
------------------------------------------------------------
Building and starting Kafka connectors...
------------------------------------------------------------
pipelinerun.tekton.dev/start-kafka-connectors-bx9pj
pipelinerun.tekton.dev/start-kafka-connectors-bx9pj condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-eventendptmgmt-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-eventendptmgmt-rolebinding created
pipeline.tekton.dev/pipeline-cp4i configured
------------------------------------------------------------
Creating the Event Endpoint Management instance...
------------------------------------------------------------
pipelinerun.tekton.dev/install-event-endpoint-management-tbvw4
pipelinerun.tekton.dev/install-event-endpoint-management-tbvw4 condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-eventendptmgmt-role unchanged
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-eventendptmgmt-rolebinding unchanged
task.tekton.dev/apic-cli-login created
task.tekton.dev/get-apic-cli created
task.tekton.dev/setup-apic-admin-user created
task.tekton.dev/setup-apic-catalog created
task.tekton.dev/setup-apic-portal-user created
task.tekton.dev/setup-apic-portal created
task.tekton.dev/setup-apic-provider-org created
pipeline.tekton.dev/pipeline-event-endpoint-management created
------------------------------------------------------------
Setting up the Event Endpoint Management instance...
------------------------------------------------------------
pipelinerun.tekton.dev/setup-event-endpoint-management-zbznf
pipelinerun.tekton.dev/setup-event-endpoint-management-zbznf condition met
clusterrole.rbac.authorization.k8s.io/pipeline-deployer-stockpricesasyncapi-role created
clusterrolebinding.rbac.authorization.k8s.io/pipeline-deployer-stockpricesasyncapi-rolebinding created
task.tekton.dev/apic-cli-login configured
task.tekton.dev/apic-stockprices-asyncapi-publish created
task.tekton.dev/create-kafka-credentials configured
task.tekton.dev/generate-asyncapi created
task.tekton.dev/get-apic-cli configured
pipeline.tekton.dev/pipeline-asyncapi created
------------------------------------------------------------
Generating and publishing doc for connectors...
------------------------------------------------------------
pipelinerun.tekton.dev/publish-topics-to-eem-5pdjj
pipelinerun.tekton.dev/publish-topics-to-eem-5pdjj condition met
Install complete.
Cloud Pak for Integration : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Event Streams : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/integration/kafka-clusters/eventstreams/es/
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Event Endpoint Management : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/integration/apis/eventendpointmanagement/eem/manager
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Developer Portal : https://eem-ptl-portal-web-eventendpointmanagement.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/events-demo/events-catalog/
username : demouser
password : FDL53QdIMQwnEWCwRFkVlZXpysGTGX0L
```
If you go to the Pipelines page in the OpenShift console, you can also see how far it has got, and access the detailed logs for each step.
Try to resist the urge to keep checking, though. You don't need to do anything in this step, and it won't go any faster if you keep watching it!
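If you'd rather watch from the command line, the pipeline runs are ordinary Tekton resources, so something like this works too:

```sh
# watch the deployment pipeline runs as they progress
oc get pipelineruns -n pipeline-eventdrivendemo --watch
```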
Step 8 - access the demo
The last thing the demo deployment script does is print out the URLs and username/password for each of the demo components.
Make a note of them.
If you lose them, you can print them again by running `make output_details`:
```sh
dalelane@Dales-MBP-2 event-endpoint-management-demo % make output_details
Install complete.
Cloud Pak for Integration : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Event Streams : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/integration/kafka-clusters/eventstreams/es/
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Event Endpoint Management : https://cpd-integration.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/integration/apis/eventendpointmanagement/eem/manager
username : admin
password : fZG9JeSO8p1CzRZPifVxS1IOBoCyojRG
Developer Portal : https://eem-ptl-portal-web-eventendpointmanagement.itzroks-120000f8p4-5mlabz-6ccd7f378ae819553d37d5f2ee142bd6-0000.ams03.containers.appdomain.cloud/events-demo/events-catalog/
username : demouser
password : FDL53QdIMQwnEWCwRFkVlZXpysGTGX0L
```
If you want to get rid of all the resources associated with the deployment pipelines, you can run `make clean`. (This doesn't remove any of the actual demo itself, just the automation used to deploy it.)
Step 9 - try it out
Use the URLs and usernames/passwords in the output to access some of the UIs deployed by the demo.
The following videos give you an idea of what you can expect to see. (These ones do include audio, sorry!)
Event Streams
This is the UI that the owner of the Kafka cluster would use. You can see all of the topics that the demo comes with:
- five topics with stock price change events for different tech companies
    - These topics get a new event roughly once every minute. Each message contains the stock price at that time. They're a nice example of a topic with a lot of historical data.
- a topic with events from Twitter
    - This topic gets a new message every time someone posts a tweet mentioning the word "kafka". Each message contains everything about that tweet - what they said, who posted it, and so on. You can try posting the word "Kafka" to Twitter yourself and see new events appear on the topic.
- two topics with flight events - one for when planes land, one for when planes take off
    - These are generated events about a fictional airport, which guarantees that you'll always get new events every few seconds (unlike the Twitter topic, where you might not get any new messages for a while if no-one mentions Kafka, or the stock price topics, which won't get new messages while stocks aren't being traded, such as at weekends).
- a topic with random values
    - Messages are posted to this topic roughly every second. Each message is a JSON payload with a random letter and a random integer - see the sketch after this list for an idea of what these look like.
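For example, a message on the random-values topic might look something like this (the property names are illustrative - check the AsyncAPI documentation in the Developer Portal for the actual schema):

```json
{
  "letter": "F",
  "number": 73
}
```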
video recording at youtu.be/5YPYKQHmc60
Event Endpoint Management (API Manager)
This is the UI that someone would use to document and publish their Kafka topics.
An "Events demo" catalog is pre-created for you, with an Event Gateway and Developer Portal set up and ready for use.
All of the demo Event Streams topics are documented in the API catalog using AsyncAPI.
The Event Gateway is configured for use with the topics, using custom Kafka credentials and SSL certificates generated for each topic.
video recording at youtu.be/5YPYKQHmc60
Event Endpoint Management (Developer Portal)
This is the UI that someone would use to discover the Kafka topics that are available within their organisation for them to use in their applications.
All of the demo Event Streams topics are published in the Developer Portal, with documentation describing where the data comes from and the schema of the messages.
The Portal helps developers to use the topics they discover, with generated code for creating new applications, and configuration properties for existing applications.
video recording at youtu.be/5YPYKQHmc60
Step 10 - try creating a client app
You can try using the Developer Portal to create your own client application.
The Developer Portal provides a self-service way to create credentials for your application. You just need to log on first, using the credentials in the demo setup output.
There are some resources with the demo to help you do this. This is particularly useful if you've not written a Kafka application before.
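For an existing Kafka application, the connection details you get from the Portal typically boil down to a handful of client properties like these - all placeholder values, and the SASL/PLAIN-over-TLS setup shown here is an assumption about how the Event Gateway is configured:

```properties
# placeholder values - the Developer Portal gives you the real ones
bootstrap.servers=<event-gateway-address>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="<your-app-credentials-key>" password="<your-app-credentials-secret>";
ssl.truststore.location=<path-to-downloaded-truststore>
ssl.truststore.password=<truststore-password>
```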
video recording at youtu.be/5YPYKQHmc60
Give it a try!
This is a long post, but I hope this doesn't put you off giving it a try.
You just need to create a few credentials, and then run a single command. And I think trying out an instance of Event Endpoint Management populated with live Kafka topics carrying real messages is the best way to get you thinking about how you could use it for real.
#eventendpointmanagement