You may want to use the out-of-the-box SFTP delivery BPs as a starting point for your version. Building it to use a single thread (one session) will be easiest to start with. The logic for two threads is more complex because it has to deal with two lock files, so save that for version 2 after version 1 is complete. Be aware that if 100 files arrive all at once you will get 100 delivery BPs all at once, so the amount of time you sleep while waiting for the lock needs to be appropriate, or you will have to replay the files that fail.
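The lock-and-wait pattern described above might be sketched in BPML roughly as follows. This is an illustration only: the parameter names (LOCK_KEY, DURATION, SLEEP_INTERVAL) and service names are assumptions and should be checked against the Lock Service and Sleep Service documentation for your B2Bi release, and the surrounding loop/choice logic is omitted.

```xml
<!-- Sketch: acquire a named lock; if it is held, sleep and retry.
     All parameter names/values here are illustrative assumptions. -->
<operation name="Acquire Delivery Lock">
  <participant name="LockService"/>
  <output message="lockRequest">
    <assign to="LOCK_KEY">SFTP_DELIVERY_LOCK</assign>
    <assign to="DURATION">600</assign>
  </output>
  <input message="lockResponse">
    <assign to="." from="*"/>
  </input>
</operation>
<!-- If the lock was not obtained, wait before retrying
     (the enclosing choice/repeat is omitted for brevity). -->
<operation name="Wait Before Retry">
  <participant name="SleepService"/>
  <output message="sleepRequest">
    <assign to="SLEEP_INTERVAL">30</assign>
  </output>
  <input message="sleepResponse">
    <assign to="." from="*"/>
  </input>
</operation>
```

The key sizing decision is the retry interval versus the lock duration: with many simultaneous delivery BPs, too short a sleep burns cycles, while too long a sleep risks BPs timing out and needing replay.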
When you complete this custom protocol process you can even use the standard replay functionality and all other out of the box SFG features.
Original Message:
Sent: Thu June 15, 2023 04:38 AM
From: RICHARD CROSS
Subject: FileGateway Concurrent Sessions
Hi Oscar,
Take a look at the BPs for MQ FTE delivery:
FileGatewayDeliverWMQFTE - initiates the delivery via MQ FTE posting a "Delivering" status update
FileGatewayProcessFTEReply - is bootstrapped from the async reply queue where MQ posts the result of the transfer. It updates the delivery info to show success or failure of the transfer
You could do something similar e.g. have a custom delivery process that moves the files to a mailbox, maybe logs the msg id, delivery key and route metadata to a custom table, and reports delivery pending (you can create your own custom event code to make it clear). The final thing the MQFTE delivery BP does is:
<assign to="FG/AsyncDelivery">true</assign>
which maybe tells SFG that delivery isn't complete (been a while since I played with this so not 100% sure).
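Put together, the tail end of a custom delivery BP along those lines might look something like this. It is a sketch only: the mailbox path, the custom event code CUST_0001, and the attribute names passed to FileGatewayRouteEventService are assumptions; compare against the delivered FileGatewayDeliverWMQFTE BP for the real shapes.

```xml
<!-- Sketch: park the file in a holding mailbox, report a custom
     "delivery pending" event, then flag the delivery as asynchronous.
     Names marked below are illustrative assumptions. -->
<operation name="Add To Holding Mailbox">
  <participant name="MailboxAddService"/>
  <output message="addRequest">
    <assign to="MailboxPath">/SFTP_Pending</assign> <!-- assumed mailbox -->
    <assign to="." from="*"/>
  </output>
  <input message="addResponse">
    <assign to="." from="*"/>
  </input>
</operation>
<operation name="Report Delivery Pending">
  <participant name="FileGatewayRouteEventService"/>
  <output message="eventRequest">
    <assign to="EventCode">CUST_0001</assign> <!-- assumed custom event code -->
    <assign to="." from="*"/>
  </output>
  <input message="eventResponse">
    <assign to="." from="*"/>
  </input>
</operation>
<!-- Finally, mark the delivery as asynchronous, as the MQ FTE BP does: -->
<assign to="FG/AsyncDelivery">true</assign>
```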
Your scheduled process can then loop through the mailbox or your custom tables to kick off however many threads to do the SFTP delivery, updating or deleting the rows from the custom table, and reporting back the transfer result via the FileGatewayRouteEventService.
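The reporting step at the end of the scheduled SFTP process could then fire the final route event, keyed by the delivery key logged earlier. Again a sketch: the event codes and the DeliveryKey attribute name are assumptions to verify against the FileGatewayRouteEventService documentation.

```xml
<!-- Sketch: report the final transfer result back to SFG.
     Event codes and attribute names are illustrative assumptions. -->
<operation name="Report Delivery Result">
  <participant name="FileGatewayRouteEventService"/>
  <output message="eventRequest">
    <assign to="EventCode">CUST_0002</assign> <!-- assumed "delivered" code;
         use a corresponding failure code when the SFTP put fails -->
    <assign to="DeliveryKey" from="deliveryKey/text()"/> <!-- logged earlier -->
    <assign to="." from="*"/>
  </output>
  <input message="eventResponse">
    <assign to="." from="*"/>
  </input>
</operation>
```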
Regards,
------------------------------
RICHARD CROSS
Original Message:
Sent: Tue June 13, 2023 12:31 AM
From: Oscar Javier Duran Lopez
Subject: FileGateway Concurrent Sessions
Hi Laurie,
Yes, this is a possible solution, but I have a problem: the status of the file in IBM File Gateway shows Delivered, because the file was sent to the mailbox. If another BP later picks the files up from the mailbox and sends them one by one via SFTP, and any file fails during that delivery, the status in IBM File Gateway is not updated to reflect the failed delivery, since "Delivered" was already shown for the initial send. The client requires seeing the correct status for the final delivery and the ability to replay the file.
Do you have another option to solve this case?
Thanks.
------------------------------
Oscar Javier Duran Lopez
Original Message:
Sent: Fri June 09, 2023 09:39 AM
From: Laurie Sibbett
Subject: FileGateway Concurrent Sessions
If I am understanding you correctly, Sterling is sending each file under a new connection/session, which is the default configuration.
Option 1: Ask the consumer of the files to pick up files from Sterling.
Option 2: You can do this in more than one way but the best way I know of would be to send all files to a mailbox (or the file system - depending on file sensitivity and location) that doesn't automatically deliver to the consumer (no channel).
Then, on a schedule with an interval that works for everyone (15 min., 1 hour), run a process that locks the initiating starter BP (which contains the variables for the process).
Here is a high-level summary to give you a good idea of the framework:
1. Starter BP name of maybe, DeliverFiles_Starter
2. Inline invoke the main BP.
3. Run the Lock Service to check for locks on that starter name.
If not locked, proceed to lock; otherwise end gracefully.
4. Extract the oldest file from the mailbox; if there is a file to send, then:
5. Begin Session to connect to the partner
6. Change directory, if needed
7. Put the file on the remote host
8. Commit the mailbox extract
9. Release the file and loop until all files have been sent
10. Close with End Session
11. Unlock the starter BP
Note: Locking the BP prevents another instance from executing while you are in the process of sending.
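Steps 5 through 10 above map roughly onto the SFTP Client services like this. Treat it as a sketch: the adapter instance name, host details, directory, and exact service/parameter names are placeholder assumptions to check against the SFTP Client Begin Session / CD / PUT / End Session service documentation, and the per-file loop around the Put is omitted.

```xml
<!-- Sketch of steps 5-10: begin session, cd, put, end session.
     Hosts, names, and parameters are illustrative assumptions. -->
<operation name="Begin Session">
  <participant name="SFTPClientBeginSession"/>
  <output message="beginRequest">
    <assign to="SFTPClientAdapter">SFTPClientAdapter</assign>
    <assign to="RemoteHost">partner.example.com</assign>
    <assign to="RemotePort">22</assign>
    <assign to="RemoteUserId">user</assign>
    <assign to="." from="*"/>
  </output>
  <input message="beginResponse">
    <assign to="." from="*"/> <!-- captures SessionToken -->
  </input>
</operation>
<operation name="Change Directory">
  <participant name="SFTPClientCd"/>
  <output message="cdRequest">
    <assign to="SessionToken" from="SessionToken/text()"/>
    <assign to="Directory">/inbound</assign>
  </output>
  <input message="cdResponse"><assign to="." from="*"/></input>
</operation>
<operation name="Put File">
  <participant name="SFTPClientPut"/>
  <output message="putRequest">
    <assign to="SessionToken" from="SessionToken/text()"/>
    <assign to="RemoteFileName" from="messageName/text()"/>
    <assign to="." from="*"/>
  </output>
  <input message="putResponse"><assign to="." from="*"/></input>
</operation>
<operation name="End Session">
  <participant name="SFTPClientEndSession"/>
  <output message="endRequest">
    <assign to="SessionToken" from="SessionToken/text()"/>
  </output>
  <input message="endResponse"><assign to="." from="*"/></input>
</operation>
```

Keeping one Begin/End Session pair around the whole loop (rather than per file) is what avoids opening a new connection per file, which was the original concern.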
Best of luck,