Processing large volume of idocs using BPM Processing

Former Member
0 Kudos

Hi,

I have a scenario in which SAP R/3 sends a large volume, say 30,000 DEBMAS IDocs, to XI. XI then sends the data to 3 legacy systems using the JDBC adapter.

I created a BPM process which waits for 4 hrs to collect all the IDocs. This is what my BPM does:

1. Wait for 4 hrs to collect the IDocs.

2. For every IDoc, do an IDoc->JDBC message transformation.

3. Append to a big list.

4. Loop over the big list from step 3, and in the loop:

5. Start a counter from 0 and increment it. Append to a small list.

6. If the counter reaches 100, send a batch JDBC message in a send step.

7. Reset the counter after every send.

8. Process the remaining list, i.e. if the count is not a multiple of 100, say 5,353 IDocs, then the remaining 53 IDocs are sent in another block.
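The batching in steps 4-8 can be sketched outside of BPM terms. A minimal Python illustration, assuming the mapped messages are already in a list (the `send_batch` callback and all names here are hypothetical, not XI APIs):

```python
def send_in_batches(big_list, send_batch, batch_size=100):
    """Mirror steps 4-8: fill a small list, send it when the counter
    reaches batch_size, reset, then flush any remainder at the end."""
    small_list = []
    batches_sent = 0
    for message in big_list:
        small_list.append(message)
        if len(small_list) == batch_size:
            send_batch(small_list)   # the batch JDBC send step
            batches_sent += 1
            small_list = []          # reset counter/list after every send
    if small_list:                   # remainder block (step 8)
        send_batch(small_list)
        batches_sent += 1
    return batches_sent
```

Note that the reset in the loop must happen on every full batch; if it is skipped (as Smitha suspects below about the counter), the loop stalls after the first 100.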

After sending 5,000 IDocs to the above BPM, the following problems occur:

1. I cannot read the workflow log, as the system does not respond.

2. In the For-Each loop that goes through the big list of, say, 5,000 IDocs, only the first pass of 100 was processed. After that the work item does not move ahead; it remains in status "STARTED" but I do not see further processing.

Please tell me why certain work items are stuck. Is it because I have reached an upper limit, and is this the right approach? The main BPM process has also been hanging for the last 2 days.

I have concerns about using BPM for processing such a high volume of IDocs in production. Please advise, and thanks in advance.

Regards

Ashish

Accepted Solutions (0)

Answers (6)


STALANKI
Active Contributor
0 Kudos

Check this blog out: /people/sravya.talanki2/blog/2005/12/29/loss-of-messages-in-ccbpm-sp12

Also check in transaction sfpa what the maximum number of workflow items and sub-workflow items is.

Former Member
0 Kudos

Hi Ashish,

Please read SAP's checklist for proper usage of BPMs: http://help.sap.com/saphelp_nw04/helpdata/en/43/d92e428819da2ce10000000a1550b0/content.htm

One point I'm wondering about: why do you send the IDocs out of R/3 one by one instead of using packaging there? From a performance standpoint that is much better than a BPM.

The SAP Checklist states the following:

<i>"No Replacement for Mass Interfaces

Check whether it would not be better to execute particular processing steps, for example, collecting messages, on the sender or receiver system.

If you only want to collect the messages from one business system to forward them together to a second business system, you should do so by using a mass interface and not an integration process.

If you want to split a message up into lots of individual messages, also use a mass interface instead of an integration process. A mass interface requires only a fraction of the back-end system and Integration-Server resources that an integration process would require to carry out the same task. "</i>

Also, you might want to have a look at the IDoc packaging capabilities within XI (available since SP14, I believe): http://help.sap.com/saphelp_nw04/helpdata/en/7a/00143f011f4b2ee10000000a114084/content.htm

And here is Sravyas good blog about this topic: /people/sravya.talanki2/blog/2005/12/09/xiidoc-message-packages

If, for whatever reason, you can't or don't want to use IDoc packets from R/3 or XI, there are other points you can focus on to optimize your process:

In the section "Using the Integration Server Efficiently" there is an overview of which steps are costly and which are not in their resource consumption. Mappings are among the steps that tend to consume a lot of resources, and unless it is a multi-mapping that cannot be executed outside a BPM, there is always the option to do the mapping in the interface determination, either before or after the BPM. So I would suggest: if your step 2 is not a multi-mapping, try to execute it before entering the BPM and just handle the JDBC messages in the BPM.

Wait steps are also costly, so reducing the time in your wait step could lead to better performance. Or, if possible, you could omit the wait step entirely and instead create a process that waits for 100 messages and then processes them.
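The collect-then-process idea can be sketched in plain Python (this is an illustration of the pattern, not a BPM definition; the class and callback names are made up):

```python
class MessageCollector:
    """Buffer incoming messages and hand off a full package as soon
    as `limit` messages have arrived, instead of holding everything
    behind a long timed wait step."""

    def __init__(self, process, limit=100):
        self.process = process   # callback invoked with each full package
        self.limit = limit
        self.buffer = []

    def receive(self, message):
        self.buffer.append(message)
        if len(self.buffer) >= self.limit:
            package, self.buffer = self.buffer, []  # swap before processing
            self.process(package)
```

The key difference from a 4-hour wait step is that each package is dispatched as soon as it fills, so memory held by the process stays bounded by `limit` rather than by the total IDoc volume.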

Regards

Christine

Former Member
0 Kudos

Hi Ashish

Collecting a huge number of IDocs will cause memory issues. You can write the IDocs to a file port from SAP instead of receiving them directly through an RFC connection. From a File/FTP adapter you can then read them at a specified time for processing, and you can even avoid BPM if there is no other logic apart from sending the same data to the 3 systems.

Regards

Prahllad

Former Member
0 Kudos

Hi,

I think Ashish is on the right track. I just saw a presentation where performance bottlenecks were discussed, and it seems it all boils down to CPU time (BPM!). Files of 20 MB each moved from one system to the other are no problem. Furthermore, when high volumes are in the picture it is always advisable to use proxies (native XI format) instead of RFC/IDocs.

Regards,

Marco

Former Member
0 Kudos

Hi,

Looks like there is an issue with the counter you are using; maybe it is not being reinitialised properly.

But using a BPM for such a large number of IDocs is usually not advised.

Regards,

Smitha.

Former Member
0 Kudos

Ashish

If you use something other than IDocs, such as RFCs, that would be better, because in your scenario you are also using a BPM on top of the huge load. This may affect the performance of the interface once you go live in the production system.

---Mohan

Former Member
0 Kudos

Also wanted to mention that I have applied OSS Note 72873 and OSS Note 888279 (option 1).

Thanks

Ashish

moorthy
Active Contributor
0 Kudos

Hi Ashish,

Just my views-

Try to change the waiting hours, or try the approach mentioned in this blog:

/people/sravya.talanki2/blog/2005/12/09/xiidoc-message-packages

Second: is it possible to collect based on the messages themselves, rather than on a time-dependent or count-based condition? For this you may need to use a proxy.

These are just my views.

Regards,

Moorthy

Former Member
0 Kudos

Hi Ashish

I think you should try to compress the IDoc metadata size. In the Integration Repository you can export a compressed IDoc XSD and use this XSD in the Integration Process (the data type is abstract).

Or this issue may be a problem of Java memory (mapping): increase the Java heap size via the configtool.

regards

Yuki Fujioka