Based on the parameter files for each connection, stored in the directory "/opt/IBM/InformationServer/Server/Projects/<DS_PROJECT>/ParameterSets/<Parameterset Name>", I recommend creating a list of these files, using different delimiters for the entries and the groups. That could be done by a script outside DataStage.
E.g. for the execution of 4 jobs in parallel:
In the sequence you could loop through the list with delimiter ";" and then split the parameters delimited by "," with the Field function.
Or you could do it completely outside DataStage and execute the jobs from a script.
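A minimal sketch of the "script outside DataStage" idea, assuming the job is compiled as multi-instance and a parameter set named `psTargetConn` holds one value file per connection (all project, job, and file names here are placeholders, and the `dsjob` invocation should be checked against your Information Server version):

```python
# Hypothetical sketch: run the same multi-instance job once per parameter-set
# value file, at most 4 at a time, via the dsjob command-line interface.
import subprocess
from concurrent.futures import ThreadPoolExecutor

PROJECT = "MY_PROJECT"       # placeholder: your DataStage project
JOB = "j_load_target"        # placeholder: the single job, compiled as multi-instance
PARAM_SET = "psTargetConn"   # placeholder: parameter set with one value file per target
VALUE_FILES = ["conn_001", "conn_002", "conn_003", "conn_004"]  # 300 in practice

def build_cmd(value_file: str) -> list[str]:
    # Selecting a parameter-set value file on the command line is done with
    # -param <set name>=<value file name>; the invocation id appended after
    # the job name lets several instances of the same job run concurrently.
    return ["dsjob", "-run", "-wait",
            "-param", f"{PARAM_SET}={value_file}",
            PROJECT, f"{JOB}.{value_file}"]

def run(value_file: str, dry_run: bool = True):
    cmd = build_cmd(value_file)
    if dry_run:                       # flip to False on a real engine tier
        return " ".join(cmd)
    return subprocess.run(cmd, check=True).returncode

# -wait makes each dsjob call block until the job finishes, so the pool size
# is what caps the number of jobs running in parallel (here: 4).
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run, VALUE_FILES))
```

With `dry_run=True` the script only assembles the command lines, which is a convenient way to verify the 300 invocations before letting them loose on the engine.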
Original Message:
Sent: Fri June 28, 2024 01:33 PM
From: Ricardo Lopez Pacheco
Subject: Change connections dynamically in Jobs with odbc connector
Hello Everyone
Thank you, Udo, Ralf and Gerhard, for the replies.
Ralf/Gerhard, I already have my parameter set with all connections.
But how can I run these jobs with loop iterations without the execution being serial? The way I do it now, the runs happen one after another; what I want is for the single job to be executed against the 300 data sources at the same time, not to wait for previous executions to finish before the next execution starts with its connection parameter.
Regards
------------------------------
Ricardo Lopez Pacheco
Original Message:
Sent: Mon June 17, 2024 05:08 AM
From: Gerhard Paulus
Subject: Change connections dynamically in Jobs with odbc connector
Hi Ricardo,
Udo and Ralf,
I don't recommend JDBC connections if any other connector (written in C) is possible, such as the database-native connectors (e.g. Db2/Oracle) or ODBC. In the case of ODBC, every data source has to be configured in odbc.ini, as Udo wrote.
As Ralf has already written, you can parameterize the ODBC connection. The best way to do this is with parameter sets. When you start the job, you then simply select the corresponding value file of the parameter set for the database against which the job should run.
![](https://dw1.s81c.com//IMWUC/MessageImages/06b25fb58a3e43d3833aaa01062abe5b.png)
![](https://dw1.s81c.com//IMWUC/MessageImages/80e4ab84ee484ad5b4e9e0b557b42b7d.png)
![](https://dw1.s81c.com//IMWUC/MessageImages/a0127dc03aed499e9d18f6644b2fa8de.png)
![](https://dw1.s81c.com//IMWUC/MessageImages/ded5e91774c444569f65a723c331c6b8.png)
![](https://dw1.s81c.com//IMWUC/MessageImages/6d08b33e3bae4f6dbd5d2cfa05f8a062.png)
![](https://dw1.s81c.com//IMWUC/MessageImages/98f612623bd941efba1daef374ded2a8.png)
Once you have implemented this job with parameter sets, you are able (if you want) to embed it into a parent sequence with some loop iterations, as Ralf described in his first reply.
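For reference, a value file of such a parameter set is a plain name=value text file stored under the project's ParameterSets directory (e.g. /opt/IBM/InformationServer/Server/Projects/MY_PROJECT/ParameterSets/psTargetConn/conn_001, where all names are placeholders). A hypothetical file for one of the 300 connections might look like:

```ini
DataSource=TARGET_DB_001
Username=etl_user
Password=*****
```

Encrypted values such as passwords are normally written by the Designer client rather than edited by hand, so it is safest to create one value file there and copy/adapt it per connection.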
Regards,
Gerhard
------------------------------
Gerhard Paulus
Original Message:
Sent: Mon June 17, 2024 02:06 AM
From: Udo Neumann
Subject: Change connections dynamically in Jobs with odbc connector
Hi Ricardo, Ralf,
for an ODBC connection you need an ODBC connection entry in the odbc.ini for each database.
Therefore I recommend a JDBC connection: then you can loop through a list of connections and pass the connection string to the job. But the JDBC connector has a much higher memory footprint than any other connection type.
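To illustrate the ODBC side of Udo's point: each of the 300 targets would need its own DSN stanza in odbc.ini on the engine tier, along these lines (driver path, host names, and DSN names below are placeholders, not the actual driver shipped with your installation):

```ini
[TARGET_DB_001]
Driver=/path/to/your/odbc/driver.so
Description=Target connection 001
HostName=dbhost001.example.com
PortNumber=1521

[TARGET_DB_002]
Driver=/path/to/your/odbc/driver.so
Description=Target connection 002
HostName=dbhost002.example.com
PortNumber=1521
```

Maintaining 300 such stanzas is exactly the administrative burden Udo is weighing against the higher memory footprint of the JDBC connector.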
Regards,
Udo
------------------------------
Udo Neumann
Original Message:
Sent: Mon June 17, 2024 01:06 AM
From: Ralf Martin
Subject: Change connections dynamically in Jobs with odbc connector
Hi Ricardo,
I have not worked much with ODBC, but you should be able to parameterize the data source. Then have a parameter DATA_SOURCE in your job that you use in the Data source field of your connector. Build a DataStage sequence with a list loop, iterating over your 300 data sources and calling the DataStage job in each iteration with the value from your list.
------------------------------
Ralf Martin
Principal Consultant
infologistix GmbH
Bregenz
Original Message:
Sent: Fri June 14, 2024 01:51 PM
From: Ricardo Lopez Pacheco
Subject: Change connections dynamically in Jobs with odbc connector
Hello everyone,
I have some questions about how to use dynamic connections with the ODBC connector, if that is possible, and how, in a parallel or sequential job in DataStage.
I have one parallel job that only inserts and updates rows in one table (pass-through).
The structure of this parallel job is:
one ODBC connector as source (this source is always the same) and one target ODBC connector, but I have 300 targets with the same table and different connections.
I don't want to duplicate the same job into 300 jobs.
I also don't want to add 300 ODBC connectors to one single job.
I want to know whether it is possible to use only one job and dynamically change the target connection, and how.
In this scenario, with just one job, I could send the same information to 300 targets.
Regards.
------------------------------
Ricardo Lopez Pacheco
------------------------------