
Processing IDocs in SAP ECC in order

AntonioSanz
Active Participant
0 Kudos

Hello all,

I've got a File --> BPM --> IDoc scenario: I am sending Purchase Order IDocs to my SAP ECC system.

I have noticed that for every file I send to my integration scenario, several Purchase Order IDocs are created and sent to SAP ECC "inside" one block (that is how I have configured the IDoc adapter). The QoS is EO. Inside this block the IDocs are processed in SAP ECC in order.

But if I send 2 files, SAP PI generates 2 different outbound queues, so 2 blocks of IDocs are sent to SAP ECC. Inside each block processing is sequential, but the two blocks run in parallel, and in my case, if I send the same file twice, I create duplicate data (although I have controls in SAP ECC).

I have tried EOIO, but in that case, if one IDoc has an error status, the rest of the IDocs are stopped. I don't want that: if one IDoc stops, I want the rest to keep processing. (I can see the IDocs in WEINBQUEUE and they are serialized.)

Is there any way to send all the IDocs in the same queue but with only EO quality of service? If not, and I use EOIO, how can I configure my SAP ECC system to continue processing IDocs no matter whether previous IDocs are in error status?

Many thanks.

Kind Regards,

Antonio Sanz.

Accepted Solutions (1)

AntonioSanz
Active Participant
0 Kudos

Thanks to all.

My solution to this problem, to avoid duplicate orders, is a custom Z table on the R/3 system and an RFC function module that updates that table with the order id.

From the SAP PI mapping, using an RFC call to that function, I ask whether the order has already been sent. If so, I don't send the order to the R/3 system again. If not, I send the order and update the Z table with the order id.
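The check-before-send logic can be sketched as follows. This is only an illustration of the idempotency pattern, not the real implementation: in the actual scenario the registry is the custom Z table on R/3 and the lookup is an RFC call made from the PI mapping; all names here are hypothetical.

```python
class OrderRegistry:
    """Stands in for the custom Z table keyed by order id (hypothetical)."""

    def __init__(self):
        self._sent = set()

    def already_sent(self, order_id):
        # In the real scenario this is the RFC lookup from the PI mapping.
        return order_id in self._sent

    def record(self, order_id):
        # In the real scenario the RFC updates the Z table with the order id.
        self._sent.add(order_id)


def forward_order(order_id, registry, send):
    """Send the order only if it has not been sent before."""
    if registry.already_sent(order_id):
        return False  # duplicate: do not send the order again
    send(order_id)
    registry.record(order_id)
    return True
```

Sending the same order id twice forwards it only once; the second call is recognized as a duplicate and skipped.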

Kind regards.

Former Member
0 Kudos

Hi Antonio,

I hope you do not do an RFC lookup for that!

A better approach is to add ABAP code during IDoc processing in SAP, like:

1. At the beginning of processing, check your Z table to see whether this is a duplicate order.

If yes, raise an IDoc error message like "Order rejected because it is a duplicate".

If no, continue processing...

2. At the end of IDoc processing, update your Z table once the order has been created correctly.
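The two steps above can be sketched like this. It is a simplified illustration of the inbound-side variant, not ABAP: the real check would live in the IDoc inbound processing code, and `seen_orders` stands in for the Z table. The returned codes are the standard IDoc statuses 51 ("application document not posted") and 53 ("application document posted").

```python
def process_idoc(order_id, seen_orders, create_order):
    """Duplicate check inside IDoc processing (illustrative sketch)."""
    # Step 1: at the start of processing, consult the Z table.
    if order_id in seen_orders:
        return "51"  # error status: order rejected as a duplicate
    create_order(order_id)
    # Step 2: at the end, record the successfully created order.
    seen_orders.add(order_id)
    return "53"  # success status: application document posted
```

The advantage over the lookup-from-PI approach is that the check and the update happen in the same system as the posting, so a duplicate simply lands in error status 51 while other IDocs continue to process.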

regards

Mickael

AntonioSanz
Active Participant
0 Kudos

Thanks, we finally decided on a custom Z table and an RFC lookup.

We also have some controls in the SAP system, but the idea was not to modify the ABAP code. That's why we decided to implement that control in PI.

Kind regards.

Answers (3)

Former Member
0 Kudos

Hi Antonio,

If you are already using a BPM, you can enhance it to collect files over a period of time and then send the collected files one by one to SAP ECC.

In this way, even with multiple files, one BPM instance will be created in PI, which in turn will send the files in the same sequence in which they were collected.

But this solution is applicable only if you know how often the files will arrive and design the BPM to handle the collection accordingly.

It will also be performance intensive, as you will end up collecting all the files in a single BPM.

Nevertheless, this is one possible solution.

Regards,

Anurag

Harish
Active Contributor
0 Kudos

Hi Antonio,

You can maintain the order at runtime using the "Maintain Order at Runtime" option in the interface determination.

Regards,

Harish

AntonioSanz
Active Participant
0 Kudos

Thanks to all.

I am still looking for a solution.

ambrish_mishra
Active Contributor
0 Kudos

Hi Antonio,

AFAIK, EOIO is not possible in a BPM. Are you sure you achieved EOIO through BPM?

>>> But if I send 2 files, SAP PI generates 2 different outbound queues, so 2 blocks of IDocs are sent to SAP ECC. Inside each block processing is sequential, but the two blocks run in parallel, and

I don't think you can control this. File generation needs to be controlled at the source.

>>>> in my case, if I send the same file twice, I create duplicate data (although I have controls in SAP ECC).

You can control this at the file adapter level. There is an option to run a duplicate check by file name.

Hope it helps!

Ambrish