Hi Brett,
I'm not completely sure which case you are trying to solve, but I think you may want to reconsider your architecture. If you are receiving binary files and (as you said) would like to receive multiple files concurrently, you should write each file to disk as soon as it is received, to release the thread quickly. Once the file is on disk (even in a temp dir) it is much easier and more efficient to process. Additional benefit: you get archiving for free.
To write a file to disk from the FTP input, you need a Java service that takes any subclass of InputStream (such as the FTPInputStream) and writes it to a file location.
Something like this:
// Get params
IDataCursor pc = pipeline.getCursor();
InputStream s = (InputStream) IDataUtil.get(pc, "contentStream");
String f = IDataUtil.getString(pc, "filename");
// TODO: check whether the file is writable, etc.

// Try to write the stream to the file; tune the buffer size for performance
long pos = 0; // declared outside the try block so it is still in scope for the output
try {
    FileOutputStream fout = new FileOutputStream(f, true);
    int read = 0;
    byte[] buffer = new byte[8192];
    while ((read = s.read(buffer, 0, buffer.length)) > 0) {
        fout.write(buffer, 0, read);
        pos += read;
    }
    fout.close();
    s.close();
} catch (Exception e) {
    throw new ServiceException(e.toString());
}

// Output the number of bytes written to the file
IDataUtil.put(pc, "bytesWritten", String.valueOf(pos));
pc.destroy();
After running this, the stream is closed and can be dropped. If you then use the Broker to publish a notification document to a trigger, the current thread can return a confirmation to the FTP client while a separate thread starts processing the file. Setting the trigger to concurrent processing mode allows the files to be processed in parallel.
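Outside of webMethods, the same write-then-notify pattern can be sketched in plain Java (the class and method names below are my own illustration, not IS or Broker API): the receiving thread only hands off the path of the file already on disk and returns immediately, while a worker pool does the actual processing in parallel.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class FtpDropProcessor {
    // Worker pool: pool size bounds how many files are processed in parallel,
    // much like the max concurrency setting on a concurrent trigger.
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    // Visible for the usage example below.
    final AtomicLong bytesProcessed = new AtomicLong();

    // Called by the receiving thread once the file is safely on disk.
    // Submitting is cheap, so the caller can acknowledge the client at once.
    public void notifyReceived(Path file) {
        pool.submit(() -> process(file));
    }

    // Runs on a worker thread; stands in for the real file processing.
    private void process(Path file) {
        try {
            bytesProcessed.addAndGet(Files.size(file));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("ftp-drop-", ".bin");
        Files.write(tmp, new byte[] {1, 2, 3});
        FtpDropProcessor p = new FtpDropProcessor();
        p.notifyReceived(tmp); // receiving thread returns immediately
        p.shutdown();          // wait for the worker before exiting
        System.out.println("bytes processed: " + p.bytesProcessed.get());
        Files.deleteIfExists(tmp);
    }
}
```

The key design point is the same as with the Broker notification: the hand-off decouples receiving from processing, so a slow processing step never blocks the FTP thread.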
Hope this helps!
Chris
#Integration-Server-and-ESB#webMethods#webmethods-Protocol-and-Transport