
SNP planning area extraction to flat files

former_member229994
Participant

We need to parallelize the extraction of data from SNP, so that the extraction produces several flat files.

What is the easiest way? In the documentation, I've seen that there is only one extraction structure per planning area. If I create a separate SNP planning version for each "envelope" (to get one flat file per version), will that give me "parallelization"? Or do I need to use BAPIs for the extraction?

Kind regards,

Christine

Accepted Solutions (1)

Former Member

Christine,

Before we discuss the various options for generating files in parallel, we need to understand the load this places on the system. Whenever we extract data from the SNP planning area, the system has to read the master data from InfoCubes and then the transaction data from liveCache, which is again a separate server.

Even when we try to run multiple extracts in parallel, we have noticed that the reads wait on both liveCache and the database (InfoCubes), and the extraction takes longer than normal.

Food for thought: you can extract through a DataSource using parallel processes (an option available when creating the DataSource) into an InfoCube. Once the data is available there, you can schedule multiple extracts into files. Even in this case, there is a limit on the number of parallel queries the database (InfoCube) can handle.
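To illustrate the idea of the last step (this is not SAP code, just a minimal Python sketch): once the planning data has been staged out of the planning area, the per-version extracts are independent of each other, so each "envelope" can be written to its own flat file in parallel. The version names, columns, and data below are invented for the example.

```python
import csv
import os
from concurrent.futures import ThreadPoolExecutor

# Stand-in for data already staged out of the planning area
# (e.g. loaded into an InfoCube first). Versions and rows are invented.
STAGED_DATA = {
    "000": [("PROD_A", "LOC_1", 100), ("PROD_B", "LOC_1", 40)],
    "SIM1": [("PROD_A", "LOC_2", 75)],
    "SIM2": [("PROD_C", "LOC_1", 10), ("PROD_C", "LOC_2", 5)],
}

def extract_version(version, rows, out_dir):
    """Write one planning version ('envelope') to its own CSV file."""
    path = os.path.join(out_dir, f"snp_extract_{version}.csv")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["product", "location", "quantity"])
        writer.writerows(rows)
    return path

def extract_all(out_dir, max_workers=3):
    # One worker per version: the files are independent, so they can be
    # written concurrently without any coordination between extracts.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [
            pool.submit(extract_version, version, rows, out_dir)
            for version, rows in STAGED_DATA.items()
        ]
        return sorted(f.result() for f in futures)
```

The point of the sketch is the shape of the job, not the technology: the parallelism is cheap only because the reads no longer hit liveCache, which is exactly Mani's argument for staging into an InfoCube first.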

Hope this helps.

Thanks

Mani

former_member229994
Participant

Mani Suresh,

Thanks. So, if I have understood correctly, the extraction goes through the BW process: I cannot create an "Excel" InfoSource directly, but need a remote InfoCube first and then extract the data, with low performance, into different InfoSources, e.g. Excel files.

I have been told about a "special" BAPI that can even read data from both the SNP planning area and the supply network data. I have not found it in the BAPI transaction. Do you know about this "special" BAPI?

Kind regards

Christine

former_member229994
Participant

Mani Suresh,

If I take the option of extraction to a remote InfoCube, can this remote cube be filled with data coming from:

1/ several Demand Planning planning areas

2/ one global SNP planning area with xx versions

3/ time series (a global SNP planning area with planned safety stock, i.e. data stored in time series)?

If I take the option of the "special" BAPI: I searched on the criterion SBP Sel* (I have been told it was something like a "special" BAPI), but I get a lot of BAPIs matching this criterion.

If you tell me that the first option does not have good performance, I will try to get more information on this second option.

Kind regards

Christine

Former Member

Hi Christine,

About BAPIs for extracting order data:

Please check the following BAPIs for extracting in-house planned orders and procurement orders in APO SNP.

1. ProcurementOrderAPS

BAPI_POSRVAPS_GETLIST3 (Read Procurement Orders for Selection Criteria - With Characteristics)

2. ManufactOrderAPS

BAPI_MOSRVAPS_GETLIST2 (Read Manufacturing Orders for Selection Criteria)

Please check and confirm.
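For what it's worth, a call to such a BAPI from outside the system usually goes through an RFC connection. The sketch below assumes a pyrfc-style connection object exposing a call(function_name, **parameters) method; the parameter name PLANNING_VERSION and the result table and field names are illustrative assumptions only; check the actual interface of BAPI_POSRVAPS_GETLIST3 (e.g. in transaction SE37) before relying on them.

```python
import csv

def extract_procurement_orders(conn, planning_version, out_path):
    """Call BAPI_POSRVAPS_GETLIST3 through an RFC connection and dump
    the returned order list to a CSV file. Parameter and field names
    are assumptions, not the verified BAPI interface."""
    result = conn.call(
        "BAPI_POSRVAPS_GETLIST3",
        PLANNING_VERSION=planning_version,   # assumed parameter name
    )
    orders = result.get("ORDER_HEAD", [])    # assumed result table name
    fieldnames = ["ORDER_NUMBER", "PRODUCT", "QUANTITY"]  # assumed fields
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for order in orders:
            writer.writerow({k: order.get(k, "") for k in fieldnames})
    return len(orders)
```

Because the connection is passed in rather than created inside the function, any object with a compatible call method works, which makes the sketch easy to try against a stub before pointing it at a real system.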

Regards

Datta

Former Member

Christine,

BAPIs are always slower and perform very poorly. I am not aware of the BAPI you are referring to here. Planning area -> InfoCube -> files should be the way to go.

For option no. 1 it is not necessary to create a remote cube. A remote cube pulls data from liveCache every time you use it; I would suggest creating a normal BW InfoCube instead.

This should be the structure:

DataSource (from the DP or SNP planning area; not sure about the order data, need to check, but there should be a way) --> InfoSource --> update rules --> InfoCube

Once you have the data loaded into the InfoCube, you can schedule multiple extracts into files from there. This way the load on liveCache (which is what causes the slowdown) is reduced, and all the data can be extracted from InfoCubes.

Hope this helps.

Thanks

Mani

former_member229994
Participant

Thanks,

Your answer is very helpful because you give examples of BAPIs, which makes it possible to find the right ones for the "with BAPI" option.

I give more points to Mani Suresh because he clearly explains that the solution through an InfoCube is better, because of the liveCache slowdown effect.

Regards,

Christine

former_member229994
Participant

Thanks Mani Suresh

You have answered my question exactly, explaining what causes the slowdown (=> avoid BAPI usage).

Kind regards,

Christine

Former Member

My pleasure to be of some help.

Thanks

Mani

Answers (1)

Former Member

Christine,

Try using InfoSpokes and Open Hub destinations. You can access them using transaction RSBO. You can create multiple InfoSpokes with different target file names, each with different selection criteria, and then use a process chain to trigger them in parallel. Furthermore, you have the option of using a BAdI to make amendments if any are required.

It's a very simple process and, if data alterations from the planning area are not required (i.e. BAPIs are not needed), it can be accomplished very quickly.

Let me know if this helps.

Cheers!

Abhi

former_member229994
Participant

Abhishek,

I have looked at the documentation for the "Open Hub Service", but the source objects seem to be SAP BW objects.

Our source is a version/SNP planning area (order-based liveCache, not "BW").

The options I would like to evaluate are:

1/ From a version/SNP planning area, generation of only one extraction structure. Based on this unique extraction structure, creation of several InfoSources so as to be able to generate several Excel files, one per version.

2/ Usage of a BAPI (which one?) able to look up information both in the Data Mart (information stored at product/location level, for example by the SNP planner) and in the planning area, and to generate flat files used afterwards in our ETL.

Neither of these two options uses the Open Hub service or the transaction you mention (I got an "obsolete" message for that transaction).

Kind regards,

Christine