Cloud Pak for Data

Streaming Analytics for Customer Life Event Prediction 


When you can see the future, you can plan ahead for your clients' financial wellness by reaching out with the right offer at the right time.

Use streaming data about your customers to generate timely, actionable alerts.

What's included?

  • A structured business glossary of more than 100 business terms.
  • Sample data science assets, including:
    • Self-documenting Jupyter notebooks
    • R Shiny application dashboard

How does it work?

The glossary provides the information architecture that you need to predict major life events, such as buying a home or relocating. And your data scientists can use the sample notebooks, predictive models, and dashboards to accelerate data preparation, machine learning modeling, and data reporting.

This accelerator includes a tutorial that shows you how to analyze streaming data about customer life events. Specifically, the tutorial shows you how to:

  • Integrate the event scoring model with a streaming data source to score events as they occur.
  • Write the scores to a database for offline analysis.
  • Develop logic to determine which scores are significant.
  • Push events with significant scores to a Kafka event store to generate actionable alerts.
  • Create an R Shiny dashboard to display recent significant scores.
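The scoring-and-alerting logic in these steps can be sketched in plain Python. This is only an illustrative outline, not the accelerator's actual model: the `score_event` function, the field names, and the significance threshold are assumptions, a plain list stands in for the live stream, and appending to `alerts` stands in for the push to the Kafka event store.

```python
# Illustrative sketch of the tutorial's flow: score incoming customer
# events, record every score (as it would be written to the offline
# database), and keep only significant scores as alerts (as they would
# be pushed to Kafka). All names and values here are assumptions.

SIGNIFICANCE_THRESHOLD = 0.8  # assumed cutoff for a "significant" score

def score_event(event):
    """Stand-in for the accelerator's event scoring model."""
    # A real deployment would invoke the deployed model here.
    return event.get("model_score", 0.0)

def process_stream(events):
    scored = []  # every score, destined for the offline database
    alerts = []  # significant scores, destined for the Kafka event store
    for event in events:
        s = score_event(event)
        scored.append((event["customer_id"], s))
        if s >= SIGNIFICANCE_THRESHOLD:
            alerts.append({"customer_id": event["customer_id"], "score": s})
    return scored, alerts

sample = [
    {"customer_id": "C1", "model_score": 0.91},  # likely life event
    {"customer_id": "C2", "model_score": 0.35},
    {"customer_id": "C3", "model_score": 0.84},
]
scored, alerts = process_stream(sample)
print(len(scored), [a["customer_id"] for a in alerts])  # 3 ['C1', 'C3']
```

In the tutorial itself, the significant scores would then feed the R Shiny dashboard for display.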

Set your clients on the path to financial success by engaging with them at the right time with relevant options.

When you import the accelerator:

  • The terms are added to your business glossary under the Customer Life Event Prediction category.
  • The data science assets are added to a new analytics project.
Figure: Data from the model displayed in the sample R Shiny dashboard app.


Importing the accelerator

To use this accelerator, complete the following steps:

  1. Download the streaming-customer-life-event-prediction.tar.gz file from the repository.
  2. Determine how you want to import the accelerator:
    • If you want to pick which components you import, complete the following steps:
      1. Extract the contents of the package.
      2. For the business glossary, use the Cloud Pak for Data web client to import the XML file. For details, see Importing a data dictionary in the Cloud Pak for Data documentation.
      3. For the data science assets, use the Cloud Pak for Data web client to import the ZIP file. For details, see Creating a project in the Cloud Pak for Data documentation.
    • If you want to install all of the components, complete the following steps:
      1. From a command prompt, run the following command to authenticate to Cloud Pak for Data:

        curl -s --output /dev/null -w '%{http_code}' -k -X POST https://{HOSTNAME}:{PORT_NUMBER}/v1/preauth/signin -F username={USERNAME} -F password={PASSWORD} -c auth_cookie.txt

      2. Run the following command to import the TAR.GZ file to Cloud Pak for Data:

        curl -k -X POST https://{HOSTNAME}:{PORT_NUMBER}/zen-watchdog/v1/accelerator -F project_file=@{FILEPATH} -b auth_cookie.txt

        For the {FILEPATH}, use the fully qualified path of the TAR.GZ file that you downloaded.

      3. Run the following command to verify that the import completed successfully:

        curl -k -b auth_cookie.txt -X GET https://{HOSTNAME}:{PORT_NUMBER}/zen-watchdog/v1/accelerator/status/{REQUEST_ID}

        For the {REQUEST_ID}, use the ID that was returned by the preceding command.
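For reference, the three endpoints used by the curl commands above can be assembled programmatically. A minimal Python sketch that only builds the URLs; the hostname, port, and request ID are placeholders, and no request is actually sent:

```python
# Helpers that assemble the Cloud Pak for Data REST endpoints shown above.
# HOSTNAME, PORT_NUMBER, and REQUEST_ID remain caller-supplied values.

def signin_url(hostname, port):
    # Authentication endpoint (step 1)
    return f"https://{hostname}:{port}/v1/preauth/signin"

def import_url(hostname, port):
    # Accelerator import endpoint (step 2)
    return f"https://{hostname}:{port}/zen-watchdog/v1/accelerator"

def status_url(hostname, port, request_id):
    # Import-status endpoint (step 3)
    return f"https://{hostname}:{port}/zen-watchdog/v1/accelerator/status/{request_id}"

print(status_url("example.com", 443, "abc123"))
# https://example.com:443/zen-watchdog/v1/accelerator/status/abc123
```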

Additional information

This accelerator has been verified on Cloud Pak for Data v2.1.0.2.


This project contains Sample Materials, provided under license.
Licensed Materials - Property of IBM.
© Copyright IBM Corp. 2019. All Rights Reserved.
US Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
