Data Integration

  • 1.  Change connections dynamically in Jobs with odbc connector

    Posted 17 days ago

    Hello everyone,
    I have some questions about whether it is possible to use dynamic connections with the ODBC connector, and if so, how to do it in a parallel or sequence job in DataStage.

    I have one parallel job that only inserts and updates rows in one table (pass-through).


    The structure of this parallel job is one ODBC connector as the source (the source is always the same) and one ODBC connector as the target, but I have 300 targets with the same table and a different connection for each.

    I don't want to duplicate the same job into 300 jobs.
    I also don't want to add 300 ODBC connectors to one single job.


    I want to know if it is possible to use only one job and dynamically change the target connection, and how to do it.
    In this scenario, with just one job I could send the same information to 300 targets.

    Regards.



    ------------------------------
    Ricardo Lopez Pacheco
    ------------------------------


  • 2.  RE: Change connections dynamically in Jobs with odbc connector

    IBM Champion
    Posted 15 days ago

    Hi Ricardo,

    I have not worked much with ODBC, but you should be able to parameterize the Data source. Define a parameter DATA_SOURCE in your job and use it in the Data source field of your connector. Then build a DataStage sequence with a list loop, iterating over your 300 data sources and calling the DataStage job in each iteration with the value from your list.
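
    For illustration, a rough shell equivalent of that loop, driven by the dsjob command line instead of a sequence, could look like the sketch below; the project name, job name and data source list are placeholders, not taken from this thread:

        #!/bin/sh
        # Run the parameterized job once per ODBC data source, one after the other
        # (the same thing a sequence list loop would do). All names are assumptions.
        # Assumes the DataStage environment (e.g. dsenv) is already sourced so dsjob is on the PATH.
        PROJECT=MYPROJECT
        JOB=JobLoadTarget
        for DSN in TIENDA_4102 TIENDA_4103 TIENDA_4104; do
            dsjob -run -jobstatus -param DATA_SOURCE=$DSN $PROJECT $JOB
        done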



    ------------------------------
    Ralf Martin
    Principal Consultant
    infologistix GmbH
    Bregenz
    ------------------------------



  • 3.  RE: Change connections dynamically in Jobs with odbc connector

    Posted 15 days ago

    Hi Ricardo, Ralf,

    for an ODBC connection you need an entry in the odbc.ini for each database.

    Therefore I recommend a JDBC connection. Then you can loop through a list of connections and pass the connection string to the job. But the JDBC connector has a much higher memory footprint than any other connection type.
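
    Just to illustrate the odbc.ini burden: with 300 targets you would need one DSN block per target, generated for example by a small script like the sketch below (the driver path, host naming pattern and keywords are assumptions and depend on your driver; the .odbc.ini location may differ on your installation):

        #!/bin/sh
        # Generate one DSN block per target and append it to the engine's .odbc.ini.
        # Driver path, host pattern, database name and keywords are placeholders.
        ODBCINI=/opt/IBM/InformationServer/Server/DSEngine/.odbc.ini
        for N in 4102 4103 4104; do
            printf '[TIENDA_%s]\nDriver=/path/to/odbc/driver.so\nHostName=dbhost%s\nDatabase=TIENDA\n\n' "$N" "$N" >> $ODBCINI
        done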

    Regards,

    Udo



    ------------------------------
    Udo Neumann
    ------------------------------



  • 4.  RE: Change connections dynamically in Jobs with odbc connector

    IBM Champion
    Posted 15 days ago
    Edited by Gerhard Paulus 15 days ago

    Hi Ricardo,
    Udo and Ralf,

    I don't recommend JDBC connections if any other connectors (written in C) are possible, such as the database-native connectors (e.g. Db2 / Oracle) or ODBC. In the case of ODBC, every data source has to be configured in odbc.ini, as Udo wrote.

    As Ralf already wrote, you can parameterize the ODBC connection. The best way to do this is with parameter sets. When you start the job, you simply select the value file of the parameter set that corresponds to the database the job should run against for that run.

    Once you have implemented the job with parameter sets, you can (if you want) embed it into a parent sequence with loop iterations, as Ralf described in his first reply.
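
    A rough command-line sketch of the same idea, selecting a different value file per run (the parameter set name psTarget and the project and job names are assumptions; the -param <parameter set>=<value file> form is how I understand dsjob selects value files, so please verify against the dsjob documentation for your version):

        #!/bin/sh
        # Run the job once per value file of the parameter set, selecting it by name.
        # psTarget, MYPROJECT and JobLoadTarget are placeholders.
        PROJECT=MYPROJECT
        JOB=JobLoadTarget
        VALDIR=/opt/IBM/InformationServer/Server/Projects/$PROJECT/ParameterSets/psTarget
        for VALUEFILE in $(ls "$VALDIR"); do
            dsjob -run -jobstatus -param psTarget=$VALUEFILE $PROJECT $JOB
        done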

    Regards,
    Gerhard



    ------------------------------
    Gerhard Paulus
    ------------------------------



  • 5.  RE: Change connections dynamically in Jobs with odbc connector

    Posted 3 days ago

    Hello Everyone

    Thank you, Udo, Ralf and Gerhard, for the replies.

    Ralf/Gerhard, I already have my parameter set with all connections.

    But how can I run these jobs with loop iterations without the execution being serial? The way I do it now, that is exactly what happens, and what I want is for the single job to run against the 300 connections at the same time, without waiting for the previous executions to finish before starting the next one with its connection parameter.

    Regards



    ------------------------------
    Ricardo Lopez Pacheco
    ------------------------------



  • 6.  RE: Change connections dynamically in Jobs with odbc connector

    Posted yesterday

    Hi Ricardo,

    based on the parameter value files for each connection, stored in the directory "/opt/IBM/InformationServer/Server/Projects/<DS_PROJECT>/ParameterSets/<Parameterset Name>", I recommend creating a list of these files, with different delimiters for the entries and the groups. That could be done by a script outside DataStage.

    E.g. for the execution of 4 jobs in parallel:

    TIENDA_4102,TIENDA_4103,TIENDA_4104,TIENDA_4107;TIENDA_4102,TIENDA_4102,TIENDA_4102,TIENDA_4102;....

    In the sequence you could loop through the list with the ";" delimiter and then split the parameters delimited by "," with the Field function.

    Not nice, but it works.

    Or you could do it completely outside DataStage and execute the jobs from a script.
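
    A rough sketch of such a script, running one batch of four instances in parallel (this assumes the job is compiled with "Allow multiple instance" so it can run several times at once; the project, job and parameter set names are placeholders):

        #!/bin/sh
        # Run four instances of the same job in parallel, one per value file,
        # then wait until the whole batch has finished. All names are assumptions.
        PROJECT=MYPROJECT
        JOB=JobLoadTarget
        for VALUEFILE in TIENDA_4102 TIENDA_4103 TIENDA_4104 TIENDA_4107; do
            # The ".invocationid" suffix makes each run a separate instance of the job.
            dsjob -run -jobstatus -param psTarget=$VALUEFILE \
                  $PROJECT $JOB.$VALUEFILE &
        done
        wait   # block until all four background runs have completed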

    Regards,
    Udo



    ------------------------------
    Udo Neumann
    ------------------------------



  • 7.  RE: Change connections dynamically in Jobs with odbc connector

    Posted yesterday

    Hello Ricardo,

    If the extraction and transformations are the same, maybe you can work with a data set between the transformation stages and the load. That should save time.
    For the load, if you run the job through DataStage you will have to wait for the end of each run, so the executions will be serial.
    To start the same parallel job simultaneously, you either call it several times in the sequence job or you work from the command line.

    Regards



    ------------------------------
    Frederic Bargues
    ------------------------------



  • 8.  RE: Change connections dynamically in Jobs with odbc connector

    IBM Champion
    Posted yesterday

    Hi Ricardo,

    Udo's suggestion is good. I did something like this in a past project and was able to run 1, 2, 4, 8 or 16 instances in parallel as needed. If you really want all 300 instances of the job to run in parallel (and the size of your installation supports this), then you either have to build this manually in your sequence (i.e. you place the job activity pointing to your job 300 times on the canvas), or you build a loop and run an Execute Command activity that calls dsjob followed by the Unix & to run it in the background; but then you have no easy control over whether your jobs were successful or not. I would rather go for Udo's approach, or build something outside of DataStage in an external scheduling tool (TWS, CTRL-M ...).
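
    If you do go the script route, one way to keep at least some control is to record the PID of each background dsjob call and wait on it individually; a hedged sketch using the same assumed names as the earlier examples:

        #!/bin/sh
        # Start the runs in the background but collect the exit code of each one,
        # so the wrapper can at least report which instances went wrong.
        PROJECT=MYPROJECT
        JOB=JobLoadTarget
        PIDS=""
        for VALUEFILE in TIENDA_4102 TIENDA_4103; do
            dsjob -run -jobstatus -param psTarget=$VALUEFILE \
                  $PROJECT $JOB.$VALUEFILE &
            PIDS="$PIDS $!:$VALUEFILE"
        done
        for ENTRY in $PIDS; do
            PID=${ENTRY%%:*}; NAME=${ENTRY#*:}
            wait $PID; RC=$?
            # With -jobstatus the dsjob exit code mirrors the job status;
            # check the dsjob documentation for which codes count as success.
            echo "Instance $NAME finished with dsjob exit code $RC"
        done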



    ------------------------------
    Ralf Martin
    Principal Consultant
    infologistix GmbH
    Bregenz
    ------------------------------



  • 9.  RE: Change connections dynamically in Jobs with odbc connector

    Posted 23 hours ago
    Hi
     
    For an ODBC connection, each database requires a distinct entry in the `odbc.ini` file. Consequently, I suggest using a JDBC connection instead. With JDBC, you can iterate through a list of connections and dynamically pass the connection string to the job. However, please note that the JDBC connector has a significantly higher memory footprint compared to other connection types.
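
    If you go the JDBC route, the connection string can be exposed as an ordinary job parameter; a minimal sketch (the parameter name JDBC_URL, the project and job names, and the URL list file are assumptions):

        #!/bin/sh
        # Run the JDBC variant of the job once per connection string read from a file.
        # jdbc_targets.txt would contain one jdbc:... URL per line; all names are placeholders.
        PROJECT=MYPROJECT
        JOB=JobLoadTargetJdbc
        while read -r URL; do
            dsjob -run -jobstatus -param "JDBC_URL=$URL" $PROJECT $JOB
        done < jdbc_targets.txt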


    ------------------------------
    Zohaib Arshad
    ------------------------------