Maximum number of records to 'BAPI_PIRSRVAPS_SAVEMULTI'

Former Member

Hi All ,

Could anybody tell me the maximum number of records that can be passed to the BAPI

BAPI_PIRSRVAPS_SAVEMULTI?

This BAPI is used for forecast upload to SNP planning area (which can be seen in product view: /sapapo/rrp3).

Win full points for the resolution...

Thanks in advance...

Chandan Dubey

Accepted Solutions (1)


Former Member

Hi Chandan - There is no simple answer to this question.

BAPI_PIRSRVAPS_SAVEMULTI has a built-in package (number of records to process) counter which sends packets of data to livecache for creating data. By default this BAPI will process all records at once, but there is a BAdI in this BAPI that allows you to set the package size, as well as many other things. The performance will depend on things like your system, environment and volume of data. There are two limitations: 1) the prereading (retrieval of matlocids, matids, locids, pegids, etc.) which happens prior to the livecache call, and 2) the livecache call itself. The prereading can cause a memory overload, but that is less likely to happen than a livecache problem. The procedures that call livecache are more likely to run out of memory than the ABAP tables are, and can cause the program to dump as well; the dump may be hard to understand.

What I have done with many programs is to add a wrapper around a livecache BAPI (or FM) call and use my own counter to send blocks or packets of data to the BAPI. For example, loop through the records in the program and call the BAPI for every 1000 records, accumulating the return info in an internal table. The number of records in each packet or block is driven by a parameter on a selection screen or a value in a Z-table, so the number can be tested and adjusted as needed. The reaction of livecache BAPIs will differ from system to system due to things such as hardware configuration and volume of data.
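A rough sketch of that wrapper in ABAP follows. Note this is an illustration only: ty_pir_item stands in for whatever item structure the BAPI actually takes, and the table bindings in the CALL FUNCTION are placeholders - check the real interface of BAPI_PIRSRVAPS_SAVEMULTI in SE37 before coding against it.

```abap
* Sketch only: ty_pir_item and the CALL FUNCTION bindings are
* placeholders - check the BAPI's real interface in SE37.
PARAMETERS p_packsz TYPE i DEFAULT 1000.            " packet size, tunable

DATA: lt_all    TYPE STANDARD TABLE OF ty_pir_item, " all input records
      lt_packet TYPE STANDARD TABLE OF ty_pir_item, " current packet
      lt_return TYPE STANDARD TABLE OF bapiret2,    " one call's messages
      lt_msgs   TYPE STANDARD TABLE OF bapiret2.    " accumulated messages

FIELD-SYMBOLS <ls_rec> TYPE ty_pir_item.

LOOP AT lt_all ASSIGNING <ls_rec>.
  APPEND <ls_rec> TO lt_packet.

  " When the packet is full, or this is the last record, send it.
  IF lines( lt_packet ) >= p_packsz OR sy-tabix = lines( lt_all ).
    CLEAR lt_return.
    CALL FUNCTION 'BAPI_PIRSRVAPS_SAVEMULTI'
      TABLES
*       ...pass lt_packet here per the real interface...
        return = lt_return.

    APPEND LINES OF lt_return TO lt_msgs.           " collect per packet

    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = abap_true.                           " commit each packet

    CLEAR lt_packet.
  ENDIF.
ENDLOOP.
```

Because p_packsz comes from the selection screen, the packet size can be tuned per system without a code change.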

If you do not code the BAPI call as I have described above, place code in the BAdI to set the packet size, or limit the number of input records some other way, then you are taking the risk that one day a specific number of records will cause a dump in this BAPI.

I would think you would be safe with 500-1000 records, but you should really test in your system and consider the options for packeting the number of records.

Andy

Former Member

Hi Andrew,

I am also facing the same problem.

Currently my file has 65 thousand records, and passing them to the BAPI works fine.

But it takes more than one hour to update livecache.

As you suggested, how do I pass packets or blocks of data to the BAPI one at a time?

Are there any limitations on the data passed to this BAPI, or are any settings required inside the BAPI?

If possible, please give me sample code showing how to proceed.

Can you please let me know how to reduce the runtime? It's a little urgent.

Kindly do the needful.

Win full points for the resolution...

Thanks in Advance,

Venkat,

Mail id : pvramana345@gmail.com.

Answers (0)