RKK,
Well, to be completely honest, there are about a million different ways to do what you are doing in the Integration Server. Typically, when we approach a pub/sub situation, we do the following:
- Extract your source data via whatever mechanism works, e.g., JDBC notification, FTP, web service, scheduled job, TN … whatever works.
- Publish that data to the Broker in its raw format. (Note: it is possible, especially in the case of web services, that your data is already in a canonical format.)
- Have a subscribing service retrieve the document via a trigger and format the data into your canonical document (a sketch of such a subscriber follows this list).
- Publish your new canonical format.
- Have your subscription services pick up the document via their triggers and convert the data into the format they need for the target system.
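To make the "retrieve via a trigger, map to canonical, republish" steps concrete, here is a minimal sketch of a subscriber written as an Integration Server Java service. The document type names (orders.docs:OrderRaw, orders.docs:OrderCanonical) and field names are hypothetical placeholders; in practice you would normally do this mapping in a Flow service with MAP steps, and the raw document arrives on the pipeline under its fully qualified publishable document type name.

```java
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.app.b2b.server.Service;
import com.wm.app.b2b.server.ServiceException;

public final class OrderToCanonical {

    // Trigger-invoked service: map the raw published document to the
    // canonical document type and republish it on the Broker.
    public static final void mapAndPublish(IData pipeline) throws ServiceException {
        IDataCursor pc = pipeline.getCursor();
        try {
            // The trigger delivers the raw document on the pipeline under its
            // fully qualified document type name (hypothetical name here).
            IData raw = IDataUtil.getIData(pc, "orders.docs:OrderRaw");
            IDataCursor rc = raw.getCursor();

            // Build the canonical document from the raw fields (hypothetical fields).
            IData canonical = IDataFactory.create();
            IDataCursor cc = canonical.getCursor();
            IDataUtil.put(cc, "orderId", IDataUtil.getString(rc, "ORD_NO"));
            IDataUtil.put(cc, "customerId", IDataUtil.getString(rc, "CUST_ID"));
            cc.destroy();
            rc.destroy();

            // Republish in the canonical format via the built-in pub.publish:publish.
            IData input = IDataFactory.create();
            IDataCursor ic = input.getCursor();
            IDataUtil.put(ic, "documentTypeName", "orders.docs:OrderCanonical");
            IDataUtil.put(ic, "document", canonical);
            ic.destroy();
            Service.doInvoke("pub.publish", "publish", input);
        } catch (Exception e) {
            throw new ServiceException(e.getMessage());
        } finally {
            pc.destroy();
        }
    }
}
```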
This pattern can be repeated for most pub/sub situations. I would argue that it can and should be used in point-to-point situations as well. By putting in this canonical layer, you have isolated your source system from your target system and made it easy to add more subscribers later on.
Performance: the Integration Server and the Broker are both high-performance components. Where you might run into trouble is publishing really large documents into the Broker. I would suggest using a more event-driven approach instead of batching them up, if that is possible. In your case this could be accomplished using the JDBC notifications. The degree of transformation and data enrichment that the data goes through will also have an effect on performance. When properly constructed, the Integration Server and Broker can handle millions of transactions per day on a relatively small platform.
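If your volumes come from the database, one way to stay event driven is to publish one small Broker document per record rather than one large batch document. A hedged sketch of that idea (again with hypothetical names, assuming a JDBC adapter service has already put an IData[] called "records" on the pipeline):

```java
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.app.b2b.server.Service;
import com.wm.app.b2b.server.ServiceException;

public final class PublishPerRecord {

    // Publish each record as its own small document instead of one large batch.
    public static final void publishEach(IData pipeline) throws ServiceException {
        IDataCursor pc = pipeline.getCursor();
        try {
            IData[] records = IDataUtil.getIDataArray(pc, "records");
            if (records == null) {
                return;
            }
            for (IData record : records) {
                IData input = IDataFactory.create();
                IDataCursor ic = input.getCursor();
                IDataUtil.put(ic, "documentTypeName", "orders.docs:OrderCanonical");
                IDataUtil.put(ic, "document", record);
                ic.destroy();
                // One publish per record keeps individual Broker documents small.
                Service.doInvoke("pub.publish", "publish", input);
            }
        } catch (Exception e) {
            throw new ServiceException(e.getMessage());
        } finally {
            pc.destroy();
        }
    }
}
```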
Of course, you could do the entire extract, transform, and load in a single Flow service. But then again, if you do it that way, you probably didn’t need to purchase the Integration Server and Broker to start with. It’s a pretty expensive batch tool.
markg
http://darth.homelinux.net