
Library Entry
Video v3.5: DataStage Data Transformation

DataStage in Cloud Pak for Data is a data integration offering for designing and running data flows that move and transform data. Create a simple parallel job This video shows you how to create a simple data transformation job in IBM Cloud Pak for Data. Create a job from a template -...

Library Entry
Videos v3.5: Watson Studio Streams Flows

You can use Streams Flows to develop streaming applications called flows and deploy them to IBM Cloud Pak for Data. Flows are created by using the simple drag-and-drop interface to ingest and analyze streaming data. Streams Flows Overview This video provides an overview of Watson Studio...

Blog Entry
Analyze streaming data with Python and Cloud Pak for Data

Imagine you are a developer for an energy company and you would like to continuously ingest and analyze data from thousands of sensors in the field. You would like to compute some statistics for each sensor, such as the rolling average, and then store that data or further analyze it by applying...
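The per-sensor rolling average described in the blurb can be sketched in plain Python. This is illustrative only: the blog itself builds the application with the Streams Python API, and the class and names below are hypothetical, not part of any Streams package.

```python
from collections import defaultdict, deque

class RollingAverage:
    """Keep a fixed-size window of recent readings per sensor
    and return the window's mean after each new reading."""
    def __init__(self, window=3):
        self.readings = defaultdict(lambda: deque(maxlen=window))

    def add(self, sensor_id, value):
        vals = self.readings[sensor_id]
        vals.append(value)
        return sum(vals) / len(vals)

avg = RollingAverage(window=3)
avg.add("sensor-1", 10.0)         # window: [10]         -> 10.0
avg.add("sensor-1", 20.0)         # window: [10, 20]     -> 15.0
print(avg.add("sensor-1", 30.0))  # window: [10, 20, 30] -> 20.0
```

In a real Streams flow this state would live inside a stateful operator that updates on every ingested tuple; the `deque(maxlen=...)` gives the same sliding-window behavior in miniature.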

Library Entry
Get started with using Streams flows by following this roadmap

Streams Flows is a web-based IDE for quickly creating streaming analytics applications in IBM Cloud Pak for Data. The applications are created and deployed in a browser. This post is a collection of videos and articles to introduce you to the canvas and show you how to create your own...

Library Entry
Optional Data Types in the SPL Programming Language

Streams 4.3 introduced a new type to the SPL programming language to better allow Streams applications to interoperate with external data sources such as databases and JSON data. Previously, there was no straightforward way for SPL developers to handle tuple attributes that had no value. For...
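The problem the SPL optional type solves, an attribute that may hold no value, has a direct analogue in other languages. A minimal Python sketch of the same idea (the function and field names here are hypothetical, not the SPL API):

```python
from typing import Optional

def read_age(record: dict) -> Optional[int]:
    """Return the 'age' attribute as an int, or None when the source
    omitted it -- mirroring how an SPL optional<int32> attribute
    can hold null instead of a sentinel value."""
    value = record.get("age")
    return int(value) if value is not None else None

print(read_age({"name": "a", "age": "42"}))  # 42
print(read_age({"name": "b"}))               # None
```

Before optional types, SPL code had to fall back on sentinel values (such as -1 or an empty string) to signal "no value"; an explicit nullable type makes the missing case visible in the type itself.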

Library Entry
Running Jobs and PEs with Special Credentials and OS Capabilities

In Streams 4.x, there are a couple of security enhancements relating to Streams jobs and their corresponding PE processes. The first enhancement is the ability to have all jobs within a Streams instance run as a configured user, rather than the default of running the jobs under the user that...

Library Entry
New to Streams? Start Here

If you're new to Streams, this page has all you need to get started. What is Streams? Watch a video overview of Streams and some of its features. How do I get started with Streams? Get started with Python, SPL, or Streams Flows. Find a sample Search for sample applications...

Library Entry
Passing parameters to an operator in a parallel region

The documentation for User-Defined Parallelism presents an example of using a FileSource inside a UDP region: @parallel(width=3) stream<uint64 i> Output = FileSource() { param file: "input" + (rstring)getChannel() + ".csv"; } If the above FileSource invocation...
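The getChannel() trick builds a distinct input file name for each parallel channel, so each of the width=3 replicas reads its own file. The naming scheme can be sketched in Python (illustrative only; the real mechanism is SPL's UDP runtime, and the function name here is made up):

```python
def channel_file(channel: int) -> str:
    # Mirrors the SPL expression "input" + (rstring)getChannel() + ".csv":
    # channel 0 reads input0.csv, channel 1 reads input1.csv, and so on.
    return "input" + str(channel) + ".csv"

for ch in range(3):
    print(channel_file(ch))  # input0.csv, input1.csv, input2.csv
```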

Library Entry
Score PMML models in real-time with Python and IBM Streams

This video shows how to use the streamsx.pmml Python package to perform scoring with a PMML model and view the results. The Streams application is created in a Python notebook running in IBM Cloud Pak for Data, but this API can be used on a local Streams installation or the Streaming...

Library Entry
How to use local files in a Streams notebook in Cloud Pak for Data

The Streams Python development guide mentions how you can quickly read local files from a Python application. However, what if the file is in a project in Cloud Pak for Data? How can you access files in a project from a Streams notebook? In this post I’ll cover how you can use data from a...