“Also I am forming the document list(By appending individual doc in each iteration). Once i get the complete list, I am doing batch insert.”
That is not a good approach for this many records.
The suggestion we are making is the same as your colleague's: chunk the file. That is:
Read 100 records (or 500, or 1000, or whatever works).
Write 100 records in one batch insert call.
Repeat until done.
Do not use Vector, appendToDocumentList, or any other collection that reallocates memory when its capacity needs to grow. Use a LinkedList or similar to collect the 1000 records, then get the array from that list just before doing the batch insert to the DB.
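In plain Java terms, the loop looks roughly like the sketch below. It uses a raw JDBC PreparedStatement as a stand-in for whatever batch insert service you call in IS, and the connection URL, file name, table, and column handling are placeholder assumptions to adjust for your environment; the point is the shape of the loop: collect a chunk in a LinkedList, convert to an array only at the moment of the batch insert, write, clear, repeat.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.LinkedList;
import java.util.List;

public class ChunkedLoader {

    private static final int CHUNK_SIZE = 1000; // tune: 100, 500, 1000...

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, file name, and table -- adjust to your environment.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             BufferedReader in = new BufferedReader(new FileReader("records.csv"));
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO target_table (col1, col2) VALUES (?, ?)")) {

            con.setAutoCommit(false);

            // LinkedList grows one node at a time -- no reallocate-and-copy as it fills up.
            List<String[]> chunk = new LinkedList<>();
            String line;

            while ((line = in.readLine()) != null) {
                chunk.add(line.split(",", -1));
                if (chunk.size() == CHUNK_SIZE) {
                    flush(ps, chunk);   // one batch insert per chunk
                    con.commit();
                    chunk.clear();      // start collecting the next chunk
                }
            }
            if (!chunk.isEmpty()) {     // write the last, partial chunk
                flush(ps, chunk);
                con.commit();
            }
        }
    }

    // Convert the collected chunk to an array only at the point of the batch insert.
    private static void flush(PreparedStatement ps, List<String[]> chunk) throws Exception {
        String[][] rows = chunk.toArray(new String[0][]);
        for (String[] row : rows) {
            ps.setString(1, row[0]);
            ps.setString(2, row[1]);
            ps.addBatch();
        }
        ps.executeBatch();
    }
}
```

In a Flow service the equivalent is to map each parsed record into the chunk list, and when the chunk reaches the configured size, pass the array to the adapter's batch insert and drop the list before reading more.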
I wouldn’t try multiple threads/processes for processing the file. You’ll spend a lot of time and effort trying to get that to work in a coordinated way.
A discussion forum probably isn't the right place to learn how to use SQL*Loader. Search the web for the docs, or get info from one of your DB people.