Locking objects and workflow challenge
First, I want to thank everybody who has contributed to my postings over the last few weeks; you have made a big difference to my understanding of the technology.
Now, the latest problem :-). I have a workflow processing IDocs; one of the tasks is to create a transfer order from a delivery. When the transfer order is created, it locks the material (standard SAP behaviour), and if the lock is already held my workflow task raises a temporary exception. After three failed attempts, a permanent exception is raised that triggers a new task forcing the user to create the transfer order in the foreground.
Here's the problem: several IDocs are often transmitted almost simultaneously. The first task gets the lock on the material, leaving the others to wait five minutes until the background restart job kicks them off again. Next time around, only one of them gets the lock, and the rest wait for the next restart job execution. Most of the tasks will exceed the three-try limit and go to the foreground task. Since there are a few hundred orders each evening for the same material, that would be a lot of tasks to process in the foreground, and the customer will not accept that (and neither would I).
Because of the traffic, I cannot wait more than a few minutes for the transfer order to be processed successfully.
Soooo, I thought I would create my own enqueue lock object based on a structure containing just the material as a data element, and in the method, before creating the transfer order, I would enqueue the material. I would also use the _WAIT option on the enqueue, and do this in a loop, say 10 times max. When the task gets the lock, it processes the transfer order, then releases the lock in the method. If it cannot get the lock for some reason, the temporary error condition still arises, and the task tries again when the next error-restart background job runs.
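To make that concrete, here is a minimal sketch of the retry loop. It assumes a custom lock object (I've called it EZ_MATL here; the name, the importing parameter, and the surrounding method signature are all hypothetical) created in SE11, which generates the ENQUEUE_EZ_MATL / DEQUEUE_EZ_MATL function modules:

```abap
* Sketch only: EZ_MATL is a hypothetical lock object keyed on MATNR,
* generated in SE11 along with ENQUEUE_EZ_MATL / DEQUEUE_EZ_MATL.
DATA: lv_locked TYPE abap_bool.

DO 10 TIMES.
  CALL FUNCTION 'ENQUEUE_EZ_MATL'
    EXPORTING
      matnr          = iv_matnr
      _wait          = 'X'      " wait for the enqueue timeout before failing
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2
      OTHERS         = 3.
  IF sy-subrc = 0.
    lv_locked = abap_true.
    EXIT.                       " got the lock, leave the retry loop
  ENDIF.
ENDDO.

IF lv_locked = abap_false.
* Still no lock: raise the temporary exception so the
* error-restart background job retries the work item later.
  RAISE temporary_error.
ENDIF.

* ... create the transfer order here ...

CALL FUNCTION 'DEQUEUE_EZ_MATL'
  EXPORTING
    matnr = iv_matnr.
```

With _WAIT set, each failed attempt already blocks for the enqueue timeout configured in the system profile, so ten iterations may hold the work item for a noticeable time; the loop count would need tuning against that.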
But there is another subtlety: I actually have to lock two materials (the order is in fact for a telephone and a SIM card, and both must be processed in the same transfer order). I try to grab both locks in my own function module; if I cannot get both, any lock already obtained is released.
So I could end up in a situation where different tasks keep trying to lock both materials, but because one task holds the lock on one material and another task holds the lock on the other, both tasks repeatedly fail to lock the pair.
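One standard way around that mutual-blocking pattern is to always acquire the two locks in a fixed order (for example, sorted by material number), so no two work items can ever hold one lock each while waiting for the other. A sketch, reusing the same hypothetical EZ_MATL lock object and made-up parameter names:

```abap
* Sketch: acquire both material locks in a fixed (sorted) order so two
* work items can never hold one lock each and block one another.
* EZ_MATL, iv_phone_matnr and iv_sim_matnr are hypothetical names.
DATA: lv_first  TYPE matnr,
      lv_second TYPE matnr.

IF iv_phone_matnr <= iv_sim_matnr.
  lv_first  = iv_phone_matnr.
  lv_second = iv_sim_matnr.
ELSE.
  lv_first  = iv_sim_matnr.
  lv_second = iv_phone_matnr.
ENDIF.

CALL FUNCTION 'ENQUEUE_EZ_MATL'
  EXPORTING
    matnr        = lv_first
    _wait        = 'X'
  EXCEPTIONS
    foreign_lock = 1
    OTHERS       = 2.
IF sy-subrc <> 0.
  RAISE lock_failed.
ENDIF.

CALL FUNCTION 'ENQUEUE_EZ_MATL'
  EXPORTING
    matnr        = lv_second
    _wait        = 'X'
  EXCEPTIONS
    foreign_lock = 1
    OTHERS       = 2.
IF sy-subrc <> 0.
* Could not get the second lock: release the first before giving up.
  CALL FUNCTION 'DEQUEUE_EZ_MATL'
    EXPORTING
      matnr = lv_first.
  RAISE lock_failed.
ENDIF.
```

Because every caller locks the smaller material number first, the "each holds one" standoff described above cannot arise; the worst case is simple waiting.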
Now I'm thinking that what I could do (if I can figure out a way to do it) is not to lock the material, but to lock the task itself with its own enqueue lock object.
Then what I could do in the method is pass the task number (not the ID of the work item, but the ID of the task type, e.g. 'TS904000032'), and lock on that before proceeding with the part of the method that creates the transfer orders (by and large it is the other tasks of this type that are locking the material). I could apply the same retry logic in the method, which would be the safer option.
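Locking on the task type effectively serialises all work items of that type through one critical section, which also removes the two-material standoff entirely. A sketch, assuming a hypothetical lock object EZ_TASK whose key is a single character field holding the task ID:

```abap
* Sketch: serialise all work items of one task type by locking the
* task type ID itself. EZ_TASK (and its task_id parameter) is a
* hypothetical lock object keyed on a CHAR14 task ID field.
CONSTANTS lc_task TYPE char14 VALUE 'TS904000032'.

CALL FUNCTION 'ENQUEUE_EZ_TASK'
  EXPORTING
    task_id        = lc_task
    _wait          = 'X'
  EXCEPTIONS
    foreign_lock   = 1
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc <> 0.
* Raise the temporary exception; the restart job retries later.
  RAISE temporary_error.
ENDIF.

* ... create the transfer order for both materials ...

CALL FUNCTION 'DEQUEUE_EZ_TASK'
  EXPORTING
    task_id = lc_task.
```

The trade-off is throughput: only one work item of this task type runs the critical section at a time, even when the materials involved do not overlap, but with a few hundred orders per evening for the same material that may be acceptable.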
So, the question is: in the workflow, is there a container element or parameter that identifies the task type?
And the other question is: does this seem reasonable? Have others come up against such situations before?