Analytics Microservice Architecture with IBM Streams
Thu July 30, 2020 07:55 PM
NATASHA D'SILVA
This article was written by Daniel Debrunner.
Microservice architecture
MartinFowler.com said: "The term 'Microservice Architecture' has sprung up over the last few years to describe a particular way of designing software applications as suites of independently deployable services." See the full article here.
IBM Streams has supported this style since its inception, and customers have taken advantage of it through dynamic connections, or import/export. A dynamic connection is a stream connected between running applications (jobs). One approach has been to have a number of ingest applications that bring data into Streams and a number of egress applications that send analytic results to external systems such as databases, dashboards, and text messages. Connecting these ingest and egress applications are analytic applications that dynamically connect to one or more streams provided by ingest applications, analyze the data in real time, and then produce streams that are in turn consumed by one or more egress applications. This allows analysts to develop their applications without having to be concerned about connectivity details such as database location or authentication tokens.
Thus a complete end-to-end system has been split into a number of smaller applications that communicate through streams, a microservice architecture, where each microservice imports streams as inputs and exports streams as outputs. Analysts quickly found that they could gain value from also exporting streams containing their results, or even intermediate results, for other analysts to import and analyze further. For example, Jayne is analyzing GPS data for a fleet of buses to provide alerts when a bus is idle for more than 15 minutes in an unexpected location (such as the side of the road, rather than a terminus). Jayne exports not only her alert stream (so that a text message application imports it) but also intermediate calculations such as a stream containing all idle events for buses regardless of position. Thomas wants to provide real-time fleet efficiency ratings, so he imports Jayne's idle events stream and correlates it with an imported stream containing weather data, as well as other data sources, to provide the analytics he requires.
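To make the example concrete, here is a minimal sketch of what Jayne's service could look like using the Python publish-subscribe interface of the com.ibm.streamsx.topology toolkit described later in this article. The 'bus/idle' topic name, the dict tuple layout, and the simplified idle check (zero speed rather than 15 minutes of inactivity) are illustrative assumptions, not part of the original example.

```python
from streamsx.topology.topology import Topology
from streamsx.topology.context import submit

def is_idle(event):
    # Simplified stand-in for Jayne's real check (idle for more than 15
    # minutes): treat a zero-speed report as an idle event.
    return event.get("speed", 0.0) == 0.0

def is_unexpected_location(event):
    # Placeholder geofence test: anything not at a terminus is unexpected.
    return not event.get("at_terminus", False)

topo = Topology("BusIdleAlerts")

# Import the fleet GPS stream published by an ingest application
# (assumed to carry dict tuples such as {"id": ..., "speed": ...}).
locations = topo.subscribe("bus/locations")

# Intermediate result: all idle events regardless of position, published
# so other analysts (such as Thomas) can build on it.
idle = locations.filter(is_idle)
idle.publish("bus/idle")

# Final result: the alert stream consumed by a text-message egress application.
alerts = idle.filter(is_unexpected_location)
alerts.publish("bus/alerts")

# Deploy as its own Streams job, independent of producers and consumers.
submit("DISTRIBUTED", topo)
```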
IBM Streams dynamic connections
Dynamic connections (import/export) are a powerful mechanism that enables a microservice architecture. Applications dynamically connect to each other through streams without creating any dependencies on each other. For example, an application that produces bus GPS data can be running 24/7 and does not care how many applications are consuming its stream; it could be one, several, or even none. Dynamic connections are a many-to-many concept, so any number of applications can produce a stream with the same characteristics and any number of consuming applications can import an exported stream. Furthermore, the Streams security model allows you to control which applications can consume from an exported stream and which applications can produce data for an imported stream. In addition, dynamic carries a further meaning:
each producer can modify a stream's characteristics to reflect a change in the data.
each consumer can modify its subscription to change what data it wants to consume.
These changes can result in connections being removed or added, changing the flow graph and rewiring the application dynamically based on whatever conditions are appropriate.
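As a hedged sketch of that many-to-many, dynamic behavior, the following second, independent application attaches to streams already being exported by running jobs; the producers keep running unchanged and are unaware of it. The topic names and the placeholder transform are assumptions standing in for Thomas's real correlation logic.

```python
from streamsx.topology.topology import Topology
from streamsx.topology.context import submit

# Thomas's fleet-efficiency job: a separate application that attaches to
# streams already exported by running jobs; those jobs are not modified.
topo = Topology("FleetEfficiency")

idle = topo.subscribe("bus/idle")            # Jayne's intermediate result
weather = topo.subscribe("weather/current")  # assumed topic from another ingest job

# Real correlation logic is omitted; a placeholder transform stands in for it.
combined = idle.union({weather})
ratings = combined.map(lambda event: {"event": event, "rating": None})
ratings.publish("fleet/efficiency")

# Submitting this job adds dynamic connections to the running producers;
# cancelling it removes them, rewiring the flow graph either way.
submit("DISTRIBUTED", topo)
```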
Publish-subscribe
Publish-subscribe is an easy-to-use simplification of the IBM Streams import/export model. It provides this functionality:
Streams are published to a topic (e.g. 'bus/locations') and subscribed to through a pattern, e.g. 'bus/locations' to match a single topic, or 'bus/+' to match any topic directly under 'bus/', e.g. 'bus/locations' and 'bus/alerts'.
Publisher and subscriber are fully independent, for example allowing a publisher to be publishing from a parallel region while the subscriber is in a non-parallel region or a parallel region with a different width. Any subscriber always receives the correct data regardless of how the stream is published.
Publish-subscribe supports stream interchange between any application implementation language; for example, an ingest SPL application may publish a stream that is then subscribed to by Python and SPL applications, or a Python application publishes a stream subscribed to by Scala and Java applications.
A follow-up article will cover publish-subscribe for IBM Streams in more detail. Publish-subscribe is provided by the com.ibm.streamsx.topology toolkit for IBM Streams 4.0.1 or later.
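As a hedged illustration of the cross-language interchange listed above, a Python application built with the com.ibm.streamsx.topology toolkit can publish with a JSON schema so that SPL, Java, or Scala applications can subscribe to the same topic. The topic name and the toy weather data are assumptions.

```python
from streamsx.topology.topology import Topology
from streamsx.topology.context import submit
from streamsx.topology.schema import CommonSchema

topo = Topology("WeatherIngest")

# Toy source; a real ingest application would read from an external feed.
def readings():
    yield {"station": "ZRH", "temp_c": 21.5}

weather = topo.source(readings)

# Publishing with a JSON schema (rather than the default Python object
# schema) lets SPL, Java, or Scala applications subscribe to the same topic.
weather.publish("weather/current", schema=CommonSchema.Json)

submit("DISTRIBUTED", topo)
```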
#CloudPakforDataGroup
#microservices
#streaming-analytics
#Streams