IBM webMethods Hybrid Integration

  • 1.  Large file handling in Broker

    Posted Wed October 05, 2016 07:49 AM

    Hello Experts,

    Could you please explain how to handle large files using Broker?

    Thanks,
    Prem.


    #Integration-Server-and-ESB
    #webMethods
    #Universal-Messaging-Broker


  • 2.  RE: Large file handling in Broker

    Posted Wed October 05, 2016 09:53 AM

    Do you mean using a node iterator to handle a large XML or flat file (iterate=true), publishing each chunk to the Broker, and then processing it?

    Can you explain what you mean by “handling large files using Broker”?


    #Integration-Server-and-ESB
    #Universal-Messaging-Broker
    #webMethods


  • 3.  RE: Large file handling in Broker

    Posted Tue March 28, 2017 03:53 PM

    Hi,

    Could you please help me with handling large files (around 17 GB) in webMethods using the pub-sub model?
    I have read in blogs that we can process them in chunks. Can anyone guide me through the steps, or share pointers that can help?

    Thanks


    #webMethods
    #Integration-Server-and-ESB
    #Universal-Messaging-Broker


  • 4.  RE: Large file handling in Broker

    Posted Tue March 28, 2017 05:35 PM

    There are different ways of doing this. For XML files you can use the XML node iterator and process the file node by node. For flat files you must use the flat file iterator, with a flat file schema and dictionary created first.
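
    To make the node-by-node idea concrete, below is a minimal sketch of an Integration Server Java service that streams a large XML file with standard StAX (used here instead of the pub.xml node-iterator services, to keep the example self-contained) and publishes one small document per repeating element. The document type Demo.docs:Record, the element name "record", the file path, and the assumption that each record holds simple text content are placeholders for illustration, not anything from this thread.

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;
    import com.wm.app.b2b.server.Service;
    import com.wm.data.IData;
    import com.wm.data.IDataCursor;
    import com.wm.data.IDataFactory;
    import com.wm.data.IDataUtil;

    public final class LargeXmlPublisher {

        // Stream the file and publish one document per <record> element.
        // Nothing larger than a single record is ever held in memory.
        public static void publishRecords(String path) throws Exception {
            XMLStreamReader reader =
                    XMLInputFactory.newInstance().createXMLStreamReader(new FileInputStream(path));
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "record".equals(reader.getLocalName())) {
                    String id = reader.getAttributeValue(null, "id");
                    String body = reader.getElementText(); // assumes <record> contains simple text
                    publishOne(id, body);
                }
            }
            reader.close();
        }

        // Wrap one record in a publishable document and hand it to pub.publish:publish.
        private static void publishOne(String id, String body) throws Exception {
            IData doc = IDataFactory.create();
            IDataCursor dc = doc.getCursor();
            IDataUtil.put(dc, "id", id);
            IDataUtil.put(dc, "body", body);
            dc.destroy();

            IData input = IDataFactory.create();
            IDataCursor ic = input.getCursor();
            IDataUtil.put(ic, "documentTypeName", "Demo.docs:Record"); // placeholder doc type
            IDataUtil.put(ic, "document", doc);
            ic.destroy();

            Service.doInvoke("pub.publish", "publish", input);
        }
    }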

    The details of these services can be found in the Built-In Services Reference and the flat file guides.

    Also look into the large-file handling capabilities provided out of the box with Terracotta BigMemory; review the Integration Server Administrator's Guide.

    As a last resort, write custom Java code that splits the big file into chunks (as streams or bytes) and then processes those chunks via pub-sub.
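
    As a rough illustration of that custom-code route (again only a sketch, with the chunk size, document type and field names as placeholders), the service below reads a file sequentially in fixed-size byte chunks and publishes each chunk with a file id and sequence number so a subscriber can reassemble or process them in order:

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import com.wm.app.b2b.server.Service;
    import com.wm.data.IData;
    import com.wm.data.IDataCursor;
    import com.wm.data.IDataFactory;
    import com.wm.data.IDataUtil;

    public final class FileChunkPublisher {

        private static final int CHUNK_SIZE = 1024 * 1024; // 1 MB per published chunk (placeholder)

        // Read the file sequentially and publish one document per chunk.
        // Memory use stays at roughly one chunk regardless of the file size.
        public static void publishChunks(String path, String fileId) throws Exception {
            try (InputStream in = new BufferedInputStream(new FileInputStream(path))) {
                byte[] buffer = new byte[CHUNK_SIZE];
                long sequence = 0;
                int read;
                while ((read = in.read(buffer)) > 0) {
                    byte[] chunk = new byte[read];
                    System.arraycopy(buffer, 0, chunk, 0, read);

                    IData doc = IDataFactory.create();
                    IDataCursor dc = doc.getCursor();
                    IDataUtil.put(dc, "fileId", fileId);
                    IDataUtil.put(dc, "sequence", Long.toString(sequence++));
                    IDataUtil.put(dc, "payload", chunk);
                    dc.destroy();

                    IData input = IDataFactory.create();
                    IDataCursor ic = input.getCursor();
                    IDataUtil.put(ic, "documentTypeName", "Demo.docs:FileChunk"); // placeholder doc type
                    IDataUtil.put(ic, "document", doc);
                    ic.destroy();

                    Service.doInvoke("pub.publish", "publish", input);
                }
            }
        }
    }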

    Altogether you will end up building a large-file handling framework that includes staging tables, processing services and, if required, reconciliation services.


    #Integration-Server-and-ESB
    #webMethods
    #Universal-Messaging-Broker


  • 5.  RE: Large file handling in Broker

    Posted Fri February 12, 2021 11:08 PM

    Good morning,

    I am interested in hearing whether this problem was ever resolved, and how.

    Thanks!


    #Universal-Messaging-Broker
    #Integration-Server-and-ESB
    #webMethods


  • 6.  RE: Large file handling in Broker

    Posted Sun February 14, 2021 07:51 AM

    My resolution would be a redesign. IMO, pub-sub is not well suited to messages several GB in size; I know of no messaging infrastructure that handles this well. I would transfer smaller messages containing references and move the actual payload some other way (FTP, S3, …).
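
    For what it's worth, that "smaller messages with references" idea is the classic claim-check pattern: move the bulk data over a file transfer or object store, and publish only a small pointer to it. A minimal sketch, with the document type, location scheme and field names as placeholders:

    import com.wm.app.b2b.server.Service;
    import com.wm.data.IData;
    import com.wm.data.IDataCursor;
    import com.wm.data.IDataFactory;
    import com.wm.data.IDataUtil;

    public final class ClaimCheckPublisher {

        // Publish a lightweight pointer to a payload that was already transferred
        // out of band (FTP drop, S3 object, shared filesystem, ...).
        public static void publishReference(String location, long sizeBytes, String sha256) throws Exception {
            IData doc = IDataFactory.create();
            IDataCursor dc = doc.getCursor();
            IDataUtil.put(dc, "location", location);      // e.g. s3://bucket/key or /mnt/stage/file.dat
            IDataUtil.put(dc, "sizeBytes", Long.toString(sizeBytes));
            IDataUtil.put(dc, "sha256", sha256);          // lets the subscriber verify the transfer
            dc.destroy();

            IData input = IDataFactory.create();
            IDataCursor ic = input.getCursor();
            IDataUtil.put(ic, "documentTypeName", "Demo.docs:PayloadReference"); // placeholder doc type
            IDataUtil.put(ic, "document", doc);
            ic.destroy();

            Service.doInvoke("pub.publish", "publish", input);
        }
    }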


    #webMethods
    #Integration-Server-and-ESB
    #Universal-Messaging-Broker


  • 7.  RE: Large file handling in Broker

    Posted Sun February 14, 2021 08:10 AM

    Your point is valid. Broker is a messaging facility, and messages should be used for “signal transport”, not for (sizeable) data transport.

    We implemented a solution for a customer that tried to send data sets between 25 and 100 MB in size (and sometimes larger). Their initial implementation used Broker, and I can tell you that it did not work too well.

    The final implementation uses DataFeedr to split the file into manageable (and predictable) units of work and push them onto the Broker, not only safeguarding against overloading the Broker but also giving them full control over throttling, memory usage, thread pool usage, etc.

    If this is still a problem, happy to share the details.

    Christian


    #Universal-Messaging-Broker
    #webMethods
    #Integration-Server-and-ESB