on 02-18-2008 8:57 PM
We have a big file with more than 20,000 records that needs to be loaded into a database table.
Should we split the file into multiple messages (1,000 records each) and insert those into the database table, or should we insert all records in one stretch?
If we follow the first approach, how do we go about restarting the message if there are any insert failures?
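To illustrate the splitting approach being asked about, here is a minimal sketch in Python (a generic illustration, assuming one record per item; the 1,000-record chunk size is the one proposed above, and the function name is hypothetical):

```python
def split_records(records, chunk_size=1000):
    """Yield successive chunks of at most chunk_size records."""
    chunk = []
    for record in records:
        chunk.append(record)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # final partial chunk

# Example: 20,500 records become 21 messages of up to 1,000 records each.
records = [f"record-{i}" for i in range(20500)]
chunks = list(split_records(records))
print(len(chunks))      # 21
print(len(chunks[-1]))  # 500
```

Each chunk would then be sent as its own message, so a failure affects only that chunk rather than the whole file.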
Hi,
> Should we split the file into multiple messages (1,000 records each) and insert those into the database table, or should we insert all records in one stretch?
1) You can split the file on the source side and send it in chunks.
2) Multiple messages are advisable.
May we know what the source system is?
Regards
Agasthuri Doss
Kris,
If it is possible, I would suggest splitting the file at the source and then sending it accordingly. Also note that when you try to insert 1,000 records and the insert fails at, say, record 1,000, everything in that insert rolls back. I would suggest using a synchronous interface on the inbound side.
raj.
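The rollback behavior raj describes can be sketched with per-batch transactions: commit each batch separately, so a failure undoes only the current batch, and the returned batch index gives a restart point for a rerun. This is a generic illustration using Python's sqlite3 as a stand-in for the actual target database; the table name and restart bookkeeping are assumptions, not PI functionality:

```python
import sqlite3

def load_in_batches(conn, records, batch_size=1000, start_batch=0):
    """Insert records batch by batch, committing per batch.
    Returns the index of the first failed batch (for restarting),
    or None if all batches were committed."""
    cur = conn.cursor()
    batches = [records[i:i + batch_size]
               for i in range(0, len(records), batch_size)]
    for idx, batch in enumerate(batches):
        if idx < start_batch:
            continue  # already loaded in a previous run
        try:
            cur.executemany("INSERT INTO target(id, payload) VALUES (?, ?)", batch)
            conn.commit()    # only this batch becomes permanent
        except sqlite3.Error:
            conn.rollback()  # only this batch is undone; earlier commits stay
            return idx
    return None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(id INTEGER PRIMARY KEY, payload TEXT)")
records = [(i, f"rec-{i}") for i in range(2500)]
failed = load_in_batches(conn, records)
print(failed)  # None: all 2,500 rows committed across 3 batches
```

On a rerun after a failure, passing `start_batch=<returned index>` skips the batches that already committed, which is one way to answer the original restart question.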
Kris,
Since you are using content conversion, just set Recordsets per Message to, for example, 1000 and do the rest of the configuration. Do you want to see the error in SXMB_MONI if it fails? If yes, make the inbound interface synchronous.
The best process, as mentioned above:
- Split the file.
- Create an asynchronous outbound and a synchronous inbound interface.
Once you finalize the above steps, you can go ahead with the mapping.
raj.
Kris,
Are you going to use BPM? Is it a one-time mass upload, or is this data expected on an ongoing basis?
raj.