Issue with Database Commit
I have two includes in my sales order user exit.
In the first include I do some validation on the order quantity of the sales order, based on the order quantity in one of my Z tables. If all validations pass, I export this order quantity to a memory ID.
In the second include I import this order quantity and update it in the same Z table in the database.
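Presumably the hand-over between the two includes looks roughly like the sketch below. This is only an illustration of the EXPORT/IMPORT TO MEMORY mechanism described above; the memory ID 'Z_ORDER_QTY', the table name ZQTY_TAB, and all variable and field names are assumptions, not from the original post.

```abap
* Include 1: after validation succeeds, pass the quantity on (sketch)
DATA lv_order_qty TYPE kwmeng.

EXPORT lv_order_qty FROM lv_order_qty TO MEMORY ID 'Z_ORDER_QTY'.

* Include 2: pick the quantity up and update the Z table (names assumed)
DATA: lv_qty   TYPE kwmeng,
      lv_vkorg TYPE vkorg,
      lv_matnr TYPE matnr.

IMPORT lv_order_qty TO lv_qty FROM MEMORY ID 'Z_ORDER_QTY'.
IF sy-subrc = 0.
  UPDATE zqty_tab SET zquant = lv_qty
    WHERE zvkorg = lv_vkorg
      AND zmatnr = lv_matnr.
  FREE MEMORY ID 'Z_ORDER_QTY'.   " clear it so a later order cannot read a stale value
ENDIF.
```

Note that the UPDATE itself only becomes visible to other sessions after the next database commit, which is exactly where the parallel-order problem below comes from.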
If I process one order at a time, everything works fine. But when I try to create two sales orders at the same time, only the second order is validated and has its order quantity updated; for the first order no validation is performed, and its order quantity is not updated in the Z table either.
I really do not know whether this issue is related to the database commit or to something else.
Chendil Kumar replied
Sachin yadav: I am not changing the order quantity in the order. By validation I mean that if there is not sufficient quantity in the Z table, it will put a header block on the sales order; it doesn't change the quantity in the actual sales order.
I was kind of hoping that this wouldn't be the case and that you just wanted to update the Z table.
Anyway, below are my thoughts on a possible solution. The issue we are trying to address is multiple orders being created in parallel, with the values from the previous order (which are used for validation in the current order) not getting written to the database fast enough.
The solution is based on "Data Clusters": the response time for writing data to and reading it from a data cluster is considerably lower, and the data is stored persistently. Go through the link below and play around with the test program provided to understand the intricacies of this solution.
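For reference, a minimal data-cluster round trip looks like the sketch below. The cluster ID 'SOME_ID' is taken from the pseudocode further down; the structure fields and the relation ID are assumptions.

```abap
* Minimal data-cluster round trip via the standard INDX cluster table (sketch)
TYPES: BEGIN OF ty_qty,
         vkorg TYPE vkorg,
         matnr TYPE matnr,
         qty   TYPE kwmeng,
       END OF ty_qty.
DATA li_itab TYPE STANDARD TABLE OF ty_qty.

* Write the internal table persistently under area 'AB', ID 'SOME_ID'
EXPORT li_itab FROM li_itab TO DATABASE indx(ab) ID 'SOME_ID'.

* ... later, possibly in another session ...
IMPORT li_itab TO li_itab FROM DATABASE indx(ab) ID 'SOME_ID'.
IF sy-subrc <> 0.
  " nothing stored yet under this ID - fall back to reading the Z table
ENDIF.
```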
I would suggest you code something along the lines below in the user exit. This is just pseudocode; modify it as per your requirement:
IMPORT itab TO li_itab FROM DATABASE indx(ab) ID 'SOME_ID'.
IF sy-subrc EQ 0.
  READ TABLE li_itab using the key values.
  " validate
  MODIFY li_itab with the updated values.
  EXPORT li_itab TO DATABASE indx(ab) ID 'SOME_ID'.
ELSE.
  SELECT the entry from the Z table for the key fields in concern.
  READ TABLE li_itab using the key values.
  " validate
  APPEND the updated values for this key to li_itab.
  EXPORT li_itab TO DATABASE indx(ab) ID 'SOME_ID'.
ENDIF.
You can then have a correction program adjust the values and update the Z table from the data clusters in a background job at the end of day (EOD).
Also, once this is done, you can make your Z table a buffered table, since you will be updating it only once a day.
Now the only concern I see here is the growing size of the data clusters. Given the key fields of your Z table (listed below), the total number of records could run into several thousands (depending on your material master); an approximation would be "number of sales areas" x "number of materials".
ZMANDT  Client (Key)
ZVKORG  Sales Organization (Key)
ZSCYC1  Sales Cycle 1 (Start Period) (Key)
ZSCYC2  Sales Cycle 2 (End Period) (Key)
ZBZIRK  Sales district (Key)
ZVKBUR  Sales Office (Key)
ZMATNR  Material Number (Key)
ZQTANO  Quota Number (Key)
I can think of an approach to overcome this too: come up with logic to build a user-defined key (up to 22 characters) from a suitable combination of the above key fields, and use it as the cluster ID. This should limit the number of records to a few thousand.
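Building that 22-character cluster ID could look like the sketch below. Which fields to combine, and how to shorten the material number to fit, is an assumption; the work-area and table names are hypothetical.

```abap
* Build a user-defined cluster ID from selected Z table key fields (sketch)
DATA: ls_ztab TYPE zqty_tab,          " hypothetical Z table work area
      li_itab TYPE STANDARD TABLE OF zqty_tab,
      lv_id   TYPE indx-srtfd.        " SRTFD: the 22-char cluster key field

* e.g. sales org (4) + sales office (4) + first 14 chars of the material
CONCATENATE ls_ztab-zvkorg ls_ztab-zvkbur ls_ztab-zmatnr(14)
       INTO lv_id.

EXPORT li_itab FROM li_itab TO DATABASE indx(ab) ID lv_id.
```

One cluster record per such key keeps the number of INDX entries bounded by the distinct combinations of the chosen fields rather than the full key of the Z table.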
I would say wait for the experts on the forum to comment on this solution; if you don't get any response on this post, you can post this solution as a separate question to get their feedback.
Edited by: Chen K V on Jun 8, 2011 10:46 AM