Let me add a possible solution path as well:
Our solution is a file processing platform that runs on top of IS and handles large files. All other IS functions are still available. For instance, if you are using Trading Networks, you can still use that for processing the EDI content in the same manner you do now.
A possible solution is to configure DataFeedr to pre-process the EDI file and break it into smaller envelopes of 835 documents (or even singles). To confirm that all data is processed, we build up a reconciliation report during processing and send it once processing completes. That reconciliation report can also be processed automatically to detect inconsistencies. The smaller envelopes then go to TN for regular EDI handling, and TN can still handle the correlation with the relevant 837s.
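To make the splitting idea concrete, here is a minimal, hedged sketch in Python of streaming an X12 file's transaction sets (ST..SE) into smaller batches while counting them for a reconciliation report. This is not DataFeedr's implementation; the batch size is an assumption, and a real splitter would also rebuild the ISA/GS envelopes and GE/IEA trailers around each batch.

```python
BATCH_SIZE = 100  # transactions per output envelope (assumption)

def split_transactions(segments):
    """Yield one list of segments per ST..SE transaction set."""
    current = None
    for seg in segments:
        tag = seg.split('*', 1)[0]
        if tag == 'ST':
            current = [seg]
        elif tag == 'SE' and current is not None:
            current.append(seg)
            yield current
            current = None
        elif current is not None:
            current.append(seg)

def batch(iterable, size):
    """Group an iterable into lists of at most `size` items."""
    buf = []
    for item in iterable:
        buf.append(item)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf

def process(segment_iter):
    """Split into small envelopes and collect reconciliation counts."""
    total = 0
    envelopes = 0
    for group in batch(split_transactions(segment_iter), BATCH_SIZE):
        envelopes += 1
        total += len(group)
        # here each smaller envelope would be re-enveloped and sent to TN
    return {'transactions': total, 'envelopes': envelopes}
```

The returned counts are the raw material for the reconciliation report: compare them against the SE/GE/IEA control counts of the original interchange to confirm nothing was dropped.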
If errors occur, the erroneous 835s are kept separately for inspection and possible resubmission. Alternatively, an error can trigger an alternative processing route or even a BPM process.
Our solution allows full control over memory and CPU usage, so you can process these files even during peak hours. And if your file sizes grow, that only affects the duration of processing, not the memory or CPU usage.
Hope this helps, Christian
#B2B-Integration#edi#webMethods