Global Data Lifecycle - Integration and Governance

Connect with Db2, Informix, Netezza, open source, and other data experts to gain value from your data, share insights, and solve problems.

IBM TechXchange Conference 2023 Data Track - Labs for Data Lifecycle Management

By NICK PLOWDEN posted Fri July 07, 2023 11:49 AM

We know that you’ve been eagerly awaiting more details about the technical sessions and labs planned for the upcoming IBM TechXchange Conference 2023, happening in Las Vegas September 11-14. Here is a preview of the labs being planned for Data Lifecycle Management. This is only a small set of the 1000+ sessions and labs that will offer you the opportunity to increase your technical knowledge and capabilities. Please note that the titles and lab descriptions are still being refined, but this gives you a sense of what is coming.

Register today for the first TechXchange Conference for technologists using IBM products and solutions. Also, save $300 with early bird pricing if you register before July 21st. See you there!

Here is the list of labs for the Data Lifecycle Management portfolio in the Data Track. Each lab's title is followed by its abstract.

Title: Building and scaling a resilient cloud-native integration with IBM App Connect Enterprise and IBM MQ
Abstract: During this lab you will deploy, scale, and upgrade a resilient cloud-native integration using IBM App Connect Enterprise and IBM MQ on Red Hat OpenShift. You will start by deploying the solution using OpenShift's built-in pipeline technology, Tekton, and see how each component is resilient to failure. An application upgrade will be rolled out with zero downtime and no effect on the end-user experience. Finally, MQ and App Connect Enterprise will be scaled horizontally, with traffic automatically balanced to handle an increase in load.
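
If you want a feel for the pipeline step before the lab, here is a minimal sketch of kicking off a Tekton PipelineRun from Python with the Kubernetes client; the pipeline name and namespace are invented for illustration.

```python
from kubernetes import client, config

# Connect using the local kubeconfig (e.g. after `oc login`).
config.load_kube_config()
api = client.CustomObjectsApi()

# A PipelineRun referencing a pipeline that deploys ACE and MQ;
# the pipeline name and namespace are hypothetical.
pipeline_run = {
    "apiVersion": "tekton.dev/v1beta1",
    "kind": "PipelineRun",
    "metadata": {"generateName": "ace-mq-deploy-"},
    "spec": {"pipelineRef": {"name": "ace-mq-pipeline"}},
}

api.create_namespaced_custom_object(
    group="tekton.dev",
    version="v1beta1",
    namespace="cp4i",
    plural="pipelineruns",
    body=pipeline_run,
)
```
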
Title: GraphQL Zero to Enterprise Lab using StepZen and API Connect
Abstract: Companies around the world are looking for a way to build, secure, and scale GraphQL APIs. Are you ready to support them? Join us to learn how to move from zero to enterprise in the GraphQL domain. First you will learn how to use StepZen to create a federated GraphQL server by pulling data from disparate sources (databases, REST APIs, etc.). Then you will use IBM API Connect to secure and manage the lifecycle of this federated GraphQL API. Come and learn how StepZen and IBM API Connect can help your clients in their API management journey.
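
As a taste of what the federated endpoint looks like to a client, here is a hedged sketch of querying a StepZen GraphQL endpoint with Python's requests; the endpoint URL, query fields, and API key are placeholders for your own schema.

```python
import requests

# Endpoint, fields, and API key are placeholders for your StepZen schema.
ENDPOINT = "https://youraccount.stepzen.net/api/customers/__graphql"
QUERY = """
query {
  customers {
    name
    orders { id total }
  }
}
"""

resp = requests.post(
    ENDPOINT,
    json={"query": QUERY},
    headers={"Authorization": "apikey YOUR_STEPZEN_API_KEY"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"])
```
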
Title: Introduction to Using API Connect and DataPower to Secure Your APIs
Abstract: In this lab session, you will learn how to use user-defined policies to enhance the security of APIs hosted by IBM API Connect using DataPower. You will learn how to configure DataPower to apply custom security policies to APIs, including advanced security features such as content-based access control, rate limiting, and threat protection. By the end of the lab, you will have a deep understanding of how to leverage DataPower's advanced security capabilities to protect your APIs from various threats.
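
To see what a gateway-enforced rate limit looks like from the caller's side, here is a minimal client sketch; the gateway URL and client ID are placeholders, and X-IBM-Client-Id is the header API Connect uses by default to identify the calling application.

```python
import time
import requests

# Gateway URL and client ID are placeholders.
URL = "https://gateway.example.com/myorg/sandbox/inventory/items"
HEADERS = {"X-IBM-Client-Id": "YOUR_CLIENT_ID"}

for attempt in range(5):
    resp = requests.get(URL, headers=HEADERS, timeout=30)
    if resp.status_code == 429:
        # Rate limit hit: back off for the interval the gateway suggests.
        time.sleep(int(resp.headers.get("Retry-After", "1")))
        continue
    resp.raise_for_status()
    print(resp.json())
    break
```
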
Title: Modernize your integrations with cloud-native-style deployment using App Connect Enterprise on CP4I
Abstract: In this lab you will learn how to evaluate the container readiness of your existing IBM Integration Bus resources running in an on-premises VM environment using ACE Transformation Advisor. Your existing integration topology may be using old-style configuration such as MQ server bindings or configurable services, which now need to be converted to newer configuration objects using policy projects. We will walk you through the steps to migrate your integration flows to IBM Cloud Pak for Integration (CP4I), refactoring where necessary by following the recommendations from ACE Transformation Advisor. You will also learn how to scale your applications in containers using ReplicaSets and autoscaling policies.
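
As a preview of the autoscaling step, here is a sketch of creating a HorizontalPodAutoscaler with the Kubernetes Python client; the deployment name, namespace, and thresholds are illustrative only.

```python
from kubernetes import client, config

config.load_kube_config()

# Scale a hypothetical ACE deployment between 2 and 10 replicas at 70% CPU.
hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="ace-flows-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="ace-flows",
        ),
        min_replicas=2,
        max_replicas=10,
        target_cpu_utilization_percentage=70,
    ),
)
client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="cp4i", body=hpa,
)
```
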
Title: Build, share, and reuse custom connectors for your business leveraging the Connector Development Kit
Abstract: Connectors play a key role in integrating applications, building APIs, and acting on events. Without connectors, users lose the versatility to quickly connect diverse types of systems, the standardization that ensures reliable consistency, and the ability to scale integrations on demand, while spending more time on maintenance and governance. In this session, we will take a deep dive into how to build your own custom connectors for easy reuse across your business.
Title: Managing event endpoints
Abstract: Event Endpoint Management lets you describe, socialize, and manage your Apache Kafka topics just as you manage APIs.

This lab walks through sharing and then consuming your first event endpoint. We will start with the sharing experience: setting up an event gateway, describing existing topics, and publishing them to a searchable catalog. We will then go through the consumption experience: finding and exploring the AsyncAPI definition and generating self-serve credentials to start using these events.
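
For a sense of the consumption experience, here is a minimal sketch of consuming from an event endpoint with kafka-python, using the kind of self-serve SASL credentials the catalog generates; the gateway address, topic name, and credentials are placeholders.

```python
from kafka import KafkaConsumer

# Gateway address, topic, and credentials are placeholders for the values
# you would get from the catalog's self-serve credential generation.
consumer = KafkaConsumer(
    "ORDERS.NEW",
    bootstrap_servers="event-gateway.example.com:443",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="generated-username",
    sasl_plain_password="generated-password",
    auto_offset_reset="earliest",
)
for message in consumer:
    print(message.value)
```
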
Title: Bridging MQ and Kafka
Abstract: IBM MQ and IBM Event Streams are a great complement to each other. Event Streams provides event distribution and streaming, and MQ messages are a valuable source of real-time events, representing the transactions, changes, and interactions occurring in the business.

Flowing MQ messages into Event Streams is a simple way to put them to additional uses: emit notifications about events in real time to enable new and responsive applications, perform real-time analytics on events as they are emitted, or run audits against a historical log of previous messages.

In this lab, you will set up a fast and reliable connection between MQ and Kafka without disrupting existing MQ apps or queues, and configure this connection to transform and reformat the messages in different ways.
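
One common way to build this bridge is the IBM MQ source connector for Kafka Connect; here is a hedged sketch of registering it through the Kafka Connect REST API, with the worker host, queue manager details, and topic invented for illustration.

```python
import requests

# Worker URL, queue manager details, and topic are placeholders.
connector = {
    "name": "mq-source",
    "config": {
        "connector.class": "com.ibm.eventstreams.connect.mqsource.MQSourceConnector",
        "mq.queue.manager": "QM1",
        "mq.connection.name.list": "mq-host(1414)",
        "mq.channel.name": "KAFKA.SVRCONN",
        "mq.queue": "TO.KAFKA",
        "topic": "mq.events",
    },
}
resp = requests.post("http://connect-worker:8083/connectors", json=connector, timeout=30)
resp.raise_for_status()
print(resp.json())
```
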
Title: Event Processing made easy
Abstract: IBM Event Automation introduces an exciting new capability for unlocking the value of events: Event Processing.

In this lab, you will get hands-on with this new technology. We will set up a variety of Kafka topics with streams of events flowing to them, and show you how to use the low-code authoring canvas to define event processing flows.

You’ll run your event processing flows and see the results directly in the low-code canvas. Then you’ll export the generated SQL and try running that yourself as a production job in Apache Flink.

The lab will show you how easy it is to start processing events on your Kafka topics.
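
To give a flavor of that last step, here is a minimal pyflink sketch of executing SQL as a standalone job; the table definition and query below are placeholders, not the SQL that Event Processing actually generates.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Placeholder source table over a Kafka topic.
t_env.execute_sql("""
    CREATE TABLE orders (
        order_id STRING,
        amount   DOUBLE
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'ORDERS.NEW',
        'properties.bootstrap.servers' = 'kafka:9092',
        'format' = 'json',
        'scan.startup.mode' = 'earliest-offset'
    )
""")

# Placeholder query standing in for the exported flow.
t_env.execute_sql("SELECT order_id, amount FROM orders WHERE amount > 100").print()
```
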
Title: Zero to 100 - All You Need to Know - Getting Started with IBM Event Streams
Abstract: Are you new to Kafka? Still trying to figure out the basics of topics, schemas, etc.? This is a good chance to polish your knowledge of all the basic components of Event Streams, the event distribution pillar of IBM Event Automation. You will learn about Kafka, replication with Geo-Replication / MirrorMaker 2, Schema Registry, the connector framework, and integration with Instana.
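
If you want to warm up before the lab, here is about the smallest possible Kafka producer in Python; the bootstrap address and topic are placeholders for your Event Streams values.

```python
import json

from kafka import KafkaProducer

# Bootstrap address and topic are placeholders for your Event Streams values.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("getting-started", {"hello": "kafka"})
producer.flush()
```
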
Title: Create your own web application using the Aspera Node API
Abstract: It is possible to connect to an Aspera node and trigger transfers using the Aspera Node API. This session will teach you how to build your own web application using the Node API and trigger transfers from a browser.
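
As a rough sketch of the server-side piece, this snippet asks a Node API endpoint for a transfer spec that a browser-side Aspera client could then execute; treat the endpoint path, port, and payload shape as assumptions to verify against your Node API documentation.

```python
import requests
from requests.auth import HTTPBasicAuth

# Host, port, credentials, and destination path are placeholders; the
# endpoint and payload shape should be checked against your Node API docs.
NODE = "https://aspera-node.example.com:9092"
payload = {
    "transfer_requests": [
        {"transfer_request": {"paths": [{"destination": "/uploads"}]}}
    ]
}

resp = requests.post(
    f"{NODE}/files/upload_setup",
    json=payload,
    auth=HTTPBasicAuth("node_user", "node_password"),
    timeout=30,
)
resp.raise_for_status()
transfer_spec = resp.json()["transfer_specs"][0]["transfer_spec"]
print(transfer_spec)
```
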
Title: Horizontally scaling IBM MQ and your applications
Abstract: In this session we will cover the different types of availability: message availability and service availability. In this lab you will use IBM MQ Native HA and Uniform Clusters to create a highly resilient, always-on IBM MQ solution.
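
For orientation, here is a minimal pymqi sketch of connecting and putting a message; the queue manager, channel, host, and queue names are placeholders, and in a Uniform Cluster the MQ client handles rebalancing connections across queue managers when you connect through a CCDT.

```python
import pymqi

# Queue manager, channel, host, and queue are placeholders.
qmgr = pymqi.connect("QM1", "DEV.APP.SVRCONN", "mq-host(1414)")
queue = pymqi.Queue(qmgr, "DEV.QUEUE.1")
queue.put(b"hello from a scaled-out application")
queue.close()
qmgr.disconnect()
```
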
Title: IBM Cloud Pak for Integration in Action - Turbocharge your Application Integration Development
Abstract: Experience firsthand an app developer's view of Cloud Pak for Integration. You will work with a single application and implement the following using the low-code/no-code capabilities of the IBM Cloud Pak for Integration components:

- Create, deploy, and test a new, external API using the IBM API Connect Developer Toolkit
- Bidirectionally sync Salesforce data with the application using IBM App Connect
- Implement near-real-time transactional data replication from the application to reporting databases using IBM Event Streams (IBM's Kafka offering)
Title: Developing and testing an IBM App Connect application
Abstract: This session is designed for integration experts and business technologists who are curious about how IBM App Connect can be used by businesses building API-driven and event-driven integration architectures. IBM App Connect includes artificial intelligence (AI) and other automation features to speed time to value and reduce the risk of longer project timelines.

The hands-on lab portion of the session will have you create a simple message flow application and use the IBM App Connect Flow Exerciser to test it. The message flow uses HTTP nodes and acts as a simple web service. Finally, you will use the IBM App Connect web user interface to check the status of the integration server and message flow application.
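
Outside the Flow Exerciser, you can exercise the same flow with any HTTP client; here is a minimal sketch, assuming the integration server's default HTTP listener port of 7800 and an invented /demo/echo path.

```python
import requests

# Host, port (7800 is a common integration-server HTTP listener default),
# and path are placeholders for your deployed flow.
resp = requests.post(
    "http://localhost:7800/demo/echo",
    json={"greeting": "hello"},
    timeout=30,
)
print(resp.status_code, resp.text)
```
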
Title: Unleash Your Data's Potential: Master Data Quality and Discovery with Cloud Pak for Data
Abstract: Data has become a crucial asset for modern businesses; however, the sheer volume and complexity of data can make it difficult to manage, organize, and maintain its quality. Data quality is essential for making accurate decisions and driving business success. Poor-quality data can lead to incorrect insights, which can result in costly mistakes and lost opportunities.

In this hands-on lab, participants will learn how to leverage CP4D to improve data quality and accelerate data discovery, how to enrich metadata and tag data assets, and how to run data quality checks, identify data anomalies, and cleanse data.

By attending this lab, participants will gain the know-how to transform their organization into a data-driven powerhouse and gain a competitive edge in the market.
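
CP4D runs these checks through its own tooling, but as an illustrative analog, here is the kind of rule a data quality check encodes, sketched with pandas on invented columns.

```python
import pandas as pd

# Invented sample data; real checks would run against governed assets.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@example.com", "not-an-email", "c@example.com", "d@example.com"],
})

report = {
    "missing_ids": int(df["customer_id"].isna().sum()),
    "duplicate_ids": int(df["customer_id"].duplicated().sum()),
    "invalid_emails": int((~df["email"].str.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$")).sum()),
}
print(report)
```
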