Get started with using Streams flows by following this roadmap



Thu October 08, 2020 09:54 PM

Streams Flows is a web-based IDE for quickly creating streaming analytics applications in IBM Cloud Pak for Data. Applications are created and deployed entirely in the browser.

This post is a collection of videos and articles that introduce you to the canvas and show you how to create your own applications with Streams flows.



Why should you use Streams Flows? This video provides an overview as well as an introduction to the Streams Flows canvas.

Set up the services

Now that your setup is complete, a great way to try out Streams Flows is to run an example application.

Running an example flow

Applications created in Streams Flows are called flows. Run this example flow to ingest data from simulated weather stations and use the Aggregation operator in Streams Flows to compute statistics like average temperature and humidity.
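To give a feel for what that example computes, here is a rough stand-in in plain Python (not the sample's actual code): it generates simulated weather station readings and averages temperature and humidity per station, which is the kind of statistic the Aggregation operator produces in the flow.

```python
import random
from statistics import mean

def generate_readings(station_ids, n, seed=42):
    """Emit n simulated (station, temperature, humidity) readings."""
    rng = random.Random(seed)
    return [
        {
            "station": rng.choice(station_ids),
            "temperature": round(rng.uniform(-5.0, 35.0), 1),
            "humidity": round(rng.uniform(20.0, 95.0), 1),
        }
        for _ in range(n)
    ]

def averages_by_station(readings):
    """Mimic what the flow's Aggregation operator computes: per-station means."""
    by_station = {}
    for r in readings:
        by_station.setdefault(r["station"], []).append(r)
    return {
        station: {
            "avg_temperature": mean(r["temperature"] for r in rows),
            "avg_humidity": mean(r["humidity"] for r in rows),
        }
        for station, rows in by_station.items()
    }

readings = generate_readings(["station_1", "station_2"], 100)
stats = averages_by_station(readings)
```

In the real flow the readings arrive continuously and the Aggregation operator recomputes these statistics over a window, rather than over a fixed batch as above.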

Monitor the running flow

The next tutorial in the series demonstrates how you can monitor a running application using the metrics page. You can observe the application’s performance, see the data as it moves between operators, and download application logs.

Learn how to monitor a running flow

Create your own application

After running an example flow and learning how to interact with a running flow, you’re now ready to create your own applications with the canvas.

Create a Streams flow with the canvas

Ingesting data

The first step in creating your own flow is connecting to a data source. You can ingest data from MQTT, HTTP servers, Watson IoT Platform, as well as Apache Kafka and IBM Event Streams.

Ingest data from IBM Event Streams as a data source


This tutorial shows you how to ingest data from IBM Event Streams, IBM’s Apache Kafka offering. You will learn how to:
1) Send data to Event Streams using a Python notebook
2) Ingest that data from a Streams flow

Try the tutorial
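As a rough sketch of step 1, the snippet below builds and serializes simulated weather readings the way a notebook might send them to an Event Streams topic. The topic name, bootstrap server, and API key are placeholders, and kafka-python is just one possible client library; the tutorial's own notebook may use a different one.

```python
import json
import random

def make_reading(station, rng):
    """Build one simulated weather reading as a JSON-serializable dict."""
    return {
        "station": station,
        "temperature": round(rng.uniform(-5.0, 35.0), 1),
        "humidity": round(rng.uniform(20.0, 95.0), 1),
    }

def serialize(reading):
    """Encode a reading the way a Kafka value serializer would."""
    return json.dumps(reading).encode("utf-8")

def make_producer(bootstrap_servers, api_key):
    """Connect to Event Streams; requires `pip install kafka-python`.
    Event Streams uses SASL/PLAIN with the literal username "token"."""
    from kafka import KafkaProducer  # imported lazily so the sketch loads without it
    return KafkaProducer(
        bootstrap_servers=bootstrap_servers,
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username="token",
        sasl_plain_password=api_key,
        value_serializer=serialize,
    )

# With real credentials you would then do something like:
#   producer = make_producer("<BOOTSTRAP_SERVERS>", "<API_KEY>")
#   producer.send("weather", make_reading("station_1", random.Random()))
#   producer.flush()
```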

Use data from IoT devices with the WatsonIoT operator

Another common data source is data from Internet of Things (IoT) devices, which is ingested in Streams Flows using the WatsonIoT operator.
Watch this video to learn how to use it, and then download the complete application from GitHub.
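On the device side of that setup, a device publishes events over MQTT for the WatsonIoT operator to pick up. The sketch below shows one way that might look; the org, device type, device id, and token are placeholders, the topic and client-id strings follow Watson IoT Platform's documented MQTT conventions, and paho-mqtt is only one possible client.

```python
import json

def device_event_topic(event_id, fmt="json"):
    """MQTT topic a device publishes events to on Watson IoT Platform."""
    return f"iot-2/evt/{event_id}/fmt/{fmt}"

def device_client_id(org, device_type, device_id):
    """MQTT client id for a device connection: d:<org>:<type>:<device>."""
    return f"d:{org}:{device_type}:{device_id}"

def publish_event(org, device_type, device_id, auth_token, event_id, payload):
    """Publish one JSON device event; requires `pip install paho-mqtt`.
    Note: the Client constructor gained a required callback-API argument
    in paho-mqtt 2.x, so adjust this for your installed version."""
    import paho.mqtt.client as mqtt  # imported lazily
    client = mqtt.Client(client_id=device_client_id(org, device_type, device_id))
    client.username_pw_set("use-token-auth", auth_token)
    client.connect(f"{org}.messaging.internetofthings.ibmcloud.com", 1883)
    client.publish(device_event_topic(event_id), json.dumps(payload))
    client.disconnect()
```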

Analyzing data

This section covers how to use some of the features in Streams flows to analyze streaming data.

Computing moving averages and running totals with the Aggregation operator

You may have noticed in the example flow that the Aggregation operator was used to compute general statistics like averages, max/min, totals, and so on. Learn more about the Aggregation operator and how to use it in this post.
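As a plain-Python analogue of those two computations (an illustration, not the operator's actual implementation), a moving average is an average over a sliding window and a running total is a cumulative sum:

```python
from collections import deque
from itertools import accumulate

class MovingAverage:
    """Count-based sliding window, analogous to an Aggregation operator
    configured with a window of `size` tuples and the `average` function."""

    def __init__(self, size):
        self.window = deque(maxlen=size)  # old values fall off automatically

    def add(self, value):
        """Ingest one value and return the average over the current window."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = MovingAverage(3)
averages = [avg.add(v) for v in [10, 20, 30, 40]]
# windows: [10], [10,20], [10,20,30], [20,30,40] → [10.0, 15.0, 20.0, 30.0]

# A running total is a cumulative sum over the whole stream:
totals = list(accumulate([10, 20, 30, 40]))  # → [10, 30, 60, 100]
```

The Aggregation operator also supports time-based windows, where tuples expire by age rather than by count.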

Add custom Python code using the Code operator

Your application might require customized logic for tuple processing, or you might want to connect to a different database that isn’t currently supported as a source or target operator, such as Cassandra.
This video shows how to do so with the Code operator.

Code operator documentation
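To make the idea concrete, here is a minimal sketch in the shape the Code operator's Python template uses: an `init(state)` setup function plus a per-tuple `process(event, state)` function (check the template generated in your own flow for the exact signatures). This one filters and enriches temperature readings, with a tiny local harness standing in for the Streams runtime.

```python
def init(state):
    # Called once when the operator starts; stash anything process() needs.
    state["threshold"] = 30.0

def process(event, state):
    # Called for each tuple; return a dict to emit it downstream, None to drop it.
    if event.get("temperature", 0.0) > state["threshold"]:
        return {**event, "alert": True}
    return None

# Tiny local harness standing in for the runtime:
state = {}
init(state)
events = [{"temperature": 25.0}, {"temperature": 31.5}]
emitted = [out for out in (process(e, state) for e in events) if out is not None]
```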

Score a model from Watson Machine Learning in your Flow

Extract more insights from your streaming data by adding machine learning using the WML Model operator. You can train a model on existing data, upload it to the Watson Machine Learning service, and then apply that model to your streaming data by deploying it in your flow.
Follow this tutorial for more information.
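The WML Model operator handles deployment and scoring against Watson Machine Learning for you; the stand-alone sketch below only illustrates the per-tuple scoring pattern, with a hand-written stub in place of the deployed model.

```python
def stub_model(features):
    """Placeholder for the deployed model: flag hot, humid readings."""
    temperature, humidity = features
    return "alert" if temperature > 30.0 and humidity > 80.0 else "ok"

def score_stream(events, model):
    """Attach a prediction to every streaming tuple as it passes through."""
    for event in events:
        features = (event["temperature"], event["humidity"])
        yield {**event, "prediction": model(features)}

scored = list(score_stream(
    [{"temperature": 32.0, "humidity": 85.0},
     {"temperature": 21.0, "humidity": 40.0}],
    stub_model,
))
```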

Score streaming data in real time using R models and Streams Flows

You can create a forecasting microservice to score an R model on data from a Streams flow.

Download sample code from GitHub.
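The sample pairs a Streams flow with an HTTP scoring microservice. As a rough sketch of the flow-side calling code (the endpoint path and payload shape here are made up, so match them to the actual sample), posting a window of values to the service might look like:

```python
import json
from urllib import request

def build_payload(values):
    """JSON body carrying one window of observations to the R service."""
    return json.dumps({"values": list(values)}).encode("utf-8")

def score_window(endpoint_url, values):
    """POST a window of values to the forecasting microservice and return
    its decoded JSON response. endpoint_url is a placeholder, e.g.
    "http://localhost:8000/forecast"."""
    req = request.Request(
        endpoint_url,
        data=build_payload(values),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```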
