If your goal here is parallel processing, then I would heed what others have mentioned above and use a concurrent trigger. If you are really exploring how to achieve this with the Java API, then here is my feedback.
The issue is not with your Java service; it is with the pipeline variables being carried over from the flow. It appears the variables you want to use are not in the pipeline. A simple test is to print out the pipeline in the Java service to see whether they are visible (a minimal dump sketch follows). I also tried the clone code further below, and there appears to be no restriction on deep-cloning a pipeline that contains Hashtables.
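For the dump, something like this as the body of a throwaway Java service is enough; it just walks the cursor and logs each top-level key and value to the server log:

// Quick sanity check: print every top-level key/value in the pipeline
// so you can confirm the expected variables actually arrived.
IDataCursor c = pipeline.getCursor();
while (c.next()) {
    System.out.println(c.getKey() + " = " + c.getValue());
}
c.destroy();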
I would really look at the variables at runtime to make sure they are getting populated correctly. If they are, then check the receiving end of the thread invoke to see whether the variables arrive in its pipeline correctly. You can call pub.flow:tracePipeline as the first step of that service to confirm the same; a sketch of the thread invoke itself follows the clone code below.
// Body of the test Java service: deep-clone the incoming pipeline and
// return the copy under "testOut". deepClone serializes the IData, so it
// declares IOException (add java.io.IOException to the service imports).
IDataCursor pipelineCursor = pipeline.getCursor();
try {
    IDataUtil.put(pipelineCursor, "testOut", IDataUtil.deepClone(pipeline));
} catch (IOException e) {
    IDataUtil.put(pipelineCursor, "errorMessage", e.toString());
}
pipelineCursor.destroy();
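And here is a minimal sketch of the thread side: spawning a service on a separate thread with a deep-cloned pipeline. The target service name myFolder:workerService is illustrative only; Service.doThreadInvoke and ServiceThread are the standard IS Java API, but verify the signatures against the Javadoc for your IS version. Requires imports com.wm.app.b2b.server.Service, com.wm.app.b2b.server.ServiceThread, com.wm.lang.ns.NSName, com.wm.data.*, and java.io.IOException.

try {
    // Clone so the child thread gets its own copy of the pipeline.
    IData cloned = IDataUtil.deepClone(pipeline);          // may throw IOException
    ServiceThread t = Service.doThreadInvoke(
            NSName.create("myFolder:workerService"),       // hypothetical target service
            Service.getSession(),
            cloned);
    // getIData() blocks until the spawned service completes and
    // returns its output pipeline.
    IData result = t.getIData();
} catch (Exception e) {
    IDataCursor c = pipeline.getCursor();
    IDataUtil.put(c, "errorMessage", e.toString());
    c.destroy();
}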
#Service-Designer #Integration-Server-and-ESB #webMethods