on 03-03-2009 5:34 PM
Hi all,
My scenario is FILE --> JDBC. Everything is OK, but my query contains multiple inserts or updates. If there is an error during processing (an insert or whatever), everything stops.
I do not want that behaviour; I would like my JDBC adapter to carry on processing even if some inserts fail!
How can I do that?
Thanks
The solution is:
In the configuration perspective, when you create the interface determination, uncheck the Quality of Service section.
Regards
Hi Joseph,
I have faced this situation and noticed the following two things.
Option 1) The database tables have constraints, e.g. NOT NULL.
In this case the table update fails completely until the data to be updated is corrected.
Option 2) The database tables have no constraints.
In this case a partial update is possible.
Rgds
joel.
Hi,
If you want this functionality, I can suggest a workaround: in your interface mapping and message mapping, change the occurrence of the target message to 0..unbounded, and in the target JDBC structure set the main node (the one containing the action and table) to occurrence 0..1. After mapping you then get multiple JDBC messages from one single file. Each JDBC message is independent and contains only one record, so only the error records fail to insert; the other records are inserted.
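After such a 1:n multi-mapping, the payload is wrapped in the standard SplitAndMerge envelope. A rough sketch of what the result looks like (the message type name and fields here are illustrative, not from this thread):

```xml
<!-- Sketch: one source file split into n JDBC messages by a 1:n multi-mapping.
     The ns0 namespace is the standard SAP split/merge namespace. -->
<ns0:Messages xmlns:ns0="http://sap.com/xi/XI/SplitAndMerge">
  <ns0:Message1>
    <!-- one instance per source record, occurrence 0..unbounded -->
    <MT_JDBC_Request> <!-- hypothetical target message type -->
      <STATEMENT>
        <dbtable action="UPDATE_INSERT">
          <table>TRAINING_CATALOG</table>
          <access>
            <field1>value1</field1>
            <field2>value2</field2>
          </access>
        </dbtable>
      </STATEMENT>
    </MT_JDBC_Request>
    <MT_JDBC_Request>
      <!-- next record, and so on -->
    </MT_JDBC_Request>
  </ns0:Message1>
</ns0:Messages>
```

Each <MT_JDBC_Request> instance is delivered to the receiver as a separate message, which is what isolates the failing records.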
Regards,
Rajeev Gupta
Edited by: RAJEEV GUPTA on Mar 4, 2009 6:08 AM
Hi,
Thanks for the answer.
I've checked what you advised. I changed the occurrence in the interface mapping, but in the message mapping I cannot do anything like that. So I looked at the data type, but the occurrence of the root node cannot be modified.
Here is my DT :
<DT_ROOT>
<stmt1>
<dtname @action="SQL_DM"/>
<access/>
</stmt1>
<stmt2>
<dtname @action="UPDATE_INSERT"/>
<table></table>
<access>
<col1/>
<col2/>
....
</access>
</stmt2>
</DT_ROOT>
regards
Hi,
In my opinion, creating multiple messages from a single file and then sending them to the database for insert/update is not a very good solution.
Example: suppose your file contains a thousand records. Your mapping program will then convert them into a thousand different messages, which hit the database a thousand times; that is never a good solution. Performance degrades badly for large amounts of data.
Secondly, the database should update the data if it is correct and throw an error message for the corrupted records.
Regards,
Sarvesh
Good, I didn't know that (I'm new to PI).
After that change all my mappings broke, so I recreated them.
Now the input message has gained these added nodes:
<Messages>
<Message1>
<DT_xxxxxx>
But in my file converter it has become a big mess; it cannot interpret the new message format.
My data type in imput is :
<MT_2330003_TrainingCatalog_Request>
<recordset>
<training_catalog>
<field1/>
<field2/>
</training_catalog>
</recordset>
</MT_2330003_TrainingCatalog_Request>
Here is my configuration :
Document Name : Messages
Document NameSpace : http://sap.com/xi/XI/SplitAndMerge
Document Offset :
Recordset Name : massage1
Recordset NameSpace :
Recordset Structure: MT_2330003_TrainingCatalog_Request,recordset,training_catalog,*
In properties :
MT_2330003_TrainingCatalog_Request.recordset.training_catalog.fieldSeparator = ;
MT_2330003_TrainingCatalog_Request.recordset.training_catalog.fieldNames = field1,field2,....
MT_2330003_TrainingCatalog_Request.recordset.training_catalog.processFieldNames = fromConfiguration
But the error is :
Conversion initialization failed: java.lang.Exception: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found: Format error in 'xml.recordset' argument: 'recordset' is not valid
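For comparison, a sender file content conversion for the original (non-split) structure would typically name only the record structure in Recordset Structure; the message type and recordset elements are covered by Document Name and Recordset Name instead. A sketch under that assumption (the namespace value is a placeholder, and the field list is abbreviated as in the post above):

```text
Document Name:       MT_2330003_TrainingCatalog_Request
Document Namespace:  (the namespace of your message type)
Recordset Name:      recordset
Recordset Structure: training_catalog,*

training_catalog.fieldSeparator    = ;
training_catalog.fieldNames        = field1,field2
training_catalog.processFieldNames = fromConfiguration
```

Note that the per-structure properties are prefixed with just the structure name from Recordset Structure, not the full element path.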
"My query contains multiple inserts or updates. If there is an error during the process (an insert or whatever), everything stops."
As per the structure shown by you, if you use multiple STATEMENT nodes, each node behaves like an independent transaction and is not stopped by an error in another node. So in your case you may achieve this with multiple STATEMENT tags.
If, on the other hand, you insert data into the same table using multiple access nodes under one statement, they are part of the same transaction, and either all of them succeed or all of them fail.
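To illustrate the difference described above (element and table names here are illustrative, not from the thread):

```xml
<MT_JDBC_Request> <!-- hypothetical JDBC receiver message type -->
  <!-- Each STATEMENT is an independent unit: if the second one
       fails, the first can still go through. -->
  <STATEMENT>
    <dbtable action="UPDATE_INSERT">
      <table>TRAINING_CATALOG</table>
      <access><field1>A</field1><field2>B</field2></access>
    </dbtable>
  </STATEMENT>
  <STATEMENT>
    <dbtable action="UPDATE_INSERT">
      <table>TRAINING_CATALOG</table>
      <!-- Two access nodes in ONE statement: same transaction,
           so both rows succeed or both are rolled back. -->
      <access><field1>C</field1><field2>D</field2></access>
      <access><field1>E</field1><field2>F</field2></access>
    </dbtable>
  </STATEMENT>
</MT_JDBC_Request>
```

So to let valid rows survive a failing row, each row needs its own STATEMENT node (or its own message, as in the split approach discussed earlier).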
Regards,
Prateek
Hi Joseph,
I haven't done anything like this myself, nor have I found any blog or answer to this query. But I feel you can give it a try using stored procedures: you can handle the inserts in a stored procedure, and if you can receive back the records in error, your problem is solved.
I think this is a good approach and would be useful to follow.
Regards
Arpil
Hi ,
For example: you pick up 5 rows using the file adapter, of which 2 rows contain invalid data, and you don't want the process to be interrupted while the JDBC adapter inserts the other 3 valid rows in the same transaction. Is that right?
If so, I think the JDBC adapter will always throw an error for the records that cannot be inserted or updated, but the other rows will be inserted without any problems.
Regards
Ashwin kumar Dhakne