Data Integration


This online community is intended for DataStage, Data Replication, and Data Integration users (including Information Server, QualityStage, FastTrack, and Information Services Director) to get advice from their industry peers, communicate with IBM experts on best practices, and stay up to date with IBM on product enhancements, user group meetings, webinars, how-to blogs, and other helpful materials. Join the conversation!
 
The first post of all new members requires admin approval - we check regularly, so if you don't see your comment or question at first, we'll get to it as soon as we can 🙂 

New to IBM Communities? Set up or link your IBM ID by logging in and joining the groups that spark your interest!
Interested specifically in Data Replication? Join the group here!

Additional resources:
  • If you have questions or need support on IBM Data Integration products, check out our Support page where you can Chat with Support, open a case, or search all technical documentation from one convenient location.
  • Developers can ask technical questions on https://stackoverflow.com using the tag "ibm-datastage".
  • IBM Cloud Pak for Data badges help DataStage users validate their cloud and data transformation expertise on IBM's Data and AI platform, which modernizes how businesses collect, organize, and analyze data to infuse AI throughout their organizations. Spanning data management, DataOps, governance, business analytics, and automated AI, IBM Cloud Pak for Data helps eliminate the need for costly, and often competing, point solutions while providing the information architecture you need to implement AI successfully. Check out IBM Cloud Pak for Data V3.0.x Essentials and IBM Cloud Pak for Data V3.0.x Data Access and Transformation.

Latest Discussions

  • Hi. For an ODBC connection, each database requires a distinct entry in the `odbc.ini` file. Consequently, I suggest using a JDBC connection instead. With JDBC, you can iterate through a list of connections and dynamically pass the connection ... (a minimal sketch of this pattern follows this list)

  • Hi Ricardo, Udo's suggestion is good. I did something like this in a past project, being able to run 1, 2, 4, 8, or 16 times in parallel if needed. If you really want to have all 300 instances of the job run in parallel (and the size of your installation ...

  • Hello Ricardo, If the extraction and transformations are the same, maybe you can work with a dataset between the Transformer and Load stages. You should save time. For the load, if you run the job through DataStage you will have to wait for the end. ...
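For readers following the ODBC vs. JDBC thread above, here is a minimal Java sketch of the pattern described: looping over a list of JDBC URLs and running the same logic against each database, so no per-database `odbc.ini` entry is needed. The URLs, environment variables, and query below are illustrative assumptions, not part of the original post; in a DataStage job these values would typically arrive as job parameters, and the appropriate JDBC driver must be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.List;

public class JdbcConnectionLoop {
    public static void main(String[] args) throws Exception {
        // Hypothetical list of JDBC URLs; in practice these might be read
        // from a file or passed in as a DataStage job parameter.
        List<String> jdbcUrls = List.of(
                "jdbc:db2://host1:50000/SALESDB",
                "jdbc:db2://host2:50000/HRDB");

        // Assumed environment variables for credentials.
        String user = System.getenv("DB_USER");
        String password = System.getenv("DB_PASSWORD");

        for (String url : jdbcUrls) {
            // Each iteration opens the next database, runs the same query,
            // and closes the connection -- no per-database odbc.ini entry.
            try (Connection conn = DriverManager.getConnection(url, user, password);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT CURRENT DATE FROM SYSIBM.SYSDUMMY1")) {
                while (rs.next()) {
                    System.out.println(url + " -> " + rs.getString(1));
                }
            }
        }
    }
}
```

The same idea carries over to a parameterized DataStage job: the connection URL becomes a job parameter, and a sequence or script drives one invocation per entry in the list.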

Latest Blogs

  • Every day, there seems to be a new integration use case for data observability. Whether it’s a shiny, modern data pipeline tool or that legacy workhorse that’s never going away, data observability spans both data stacks. With that in mind, we’re excited ...

  • IBM Cloud Pak for Data has repeatedly served as a hub for all data and analytics tools and a cornerstone in companies’ journeys towards digital transformation, helping users implement a data fabric architecture and achieve success through data-driven ...

  • Introduction In recent years the question of whether to follow an ETL (extract, transform, load) or an ELT (extract, load, transform) data integration approach has gained increased attention, since in today's hybrid cloud world data ...

  • In today’s dynamic business environment, organizations require high availability of data to make agile, savvy decisions. They cannot afford to compromise the availability of their data. Indeed, Forrester has estimated that 50% of all enterprises could ...

  • In recent years, the data integration market has undergone substantial evolution, a process that continues today. Users are now seeking to fulfill diverse objectives through their array of data integration tools. These aims include driving operational ...



Community Members
859 Members