Better to Load Key Figures into Planning Book or Cube?

alex_campbell
Contributor
0 Kudos

Hi Experts,

I'm starting to design a process whereby users can load key figures into DP. We're on SCM 4.1, so the standard "Upload from Excel" functionality in 5.0 is not available to us. We have a number of users, each responsible for key figures at various locations, who will be uploading values for thousands of CVCs at a time. I expect that on occasion multiple users will be loading key figures at the same time (although the CVCs should be different).

I've found plenty of info on how to load these key figures into a Cube using a Process Chain. I've also found info on loading into the Planning Book using a BAPI (BAPI_PBSRVAPS_CHANGEKEYFIGVAL2). But I couldn't find any discussion on whether one or the other should be used for large loads. Since the end users have an understanding of the Planning Book, but wouldn't really have an understanding of the Cube, I'm leaning towards using the BAPI. Do you know of any reasons not to do these large loads using the BAPI? Do you know of any reasons why loading into the Cube would be a better choice? In general, would one be faster than the other?
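For context, here is the shape of the call I have in mind. Everything SAP-specific below is my assumption for illustration only — the connection details, field names, and table parameters are made up, and the BAPI's real interface should be checked in SE37. The remote call uses the pyrfc library purely as an example client:

```python
def build_bapi_params(planning_book, data_view, rows):
    """Assemble keyword arguments for the RFC call.

    The parameter and field names here are illustrative assumptions,
    not the BAPI's documented interface.
    """
    return {
        "PLANNINGBOOK": planning_book,
        "DATA_VIEW": data_view,
        "KEY_FIGURE_VALUES": rows,  # one dict per CVC/key figure/period
    }

if __name__ == "__main__":
    params = build_bapi_params(
        "ZDP_BOOK", "ZDP_VIEW",  # hypothetical planning book / data view names
        [{"CVC": "1000/PLANT01", "KEY_FIGURE": "FORECAST", "VALUE": 42.0}],
    )
    try:
        from pyrfc import Connection  # requires the SAP NW RFC SDK
    except ImportError:
        print("pyrfc not installed; showing the payload only:", params)
    else:
        # Host and credentials below are placeholders.
        conn = Connection(ashost="scmhost", sysnr="00", client="100",
                          user="RFCUSER", passwd="secret")
        conn.call("BAPI_PBSRVAPS_CHANGEKEYFIGVAL2", **params)
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")  # BAPIs need an explicit commit
```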

Thanks,

Alex

Accepted Solutions (1)

alex_campbell
Contributor
0 Kudos

Thanks everyone for your helpful advice!

In the end we decided to go with the BAPI (BAPI_PBSRVAPS_CHANGEKEYFIGVAL2) approach. After talking with a consultant with more SCM background than we had, we made a decision based on the following factors:

1. We are more proficient with ABAP than with Process Chains and Cubes (sometimes you just have to go with what you know).

2. We expect the BAPI to be faster, because it updates the Planning Area in liveCache directly. Preliminary testing showed that a Process Chain load for a typical workload takes about 80 minutes, which does not meet the business requirements.

3. We expect the BAPI to handle concurrency better: it has logic to wait and retry if a particular CVC/key figure is locked. As I said initially, we expect there will be times when multiple users are loading to the same planning book, and even to the same CVC, at the same time.
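For what it's worth, the wait-and-retry behaviour we're relying on in point 3 can be sketched generically like this. This is only an illustration of the pattern — the real BAPI handles the locking internally, and nothing here is its actual code:

```python
import time

class LockedError(Exception):
    """Signals that the target CVC/key figure is locked by another user."""

def call_with_retry(load_fn, max_attempts=5, wait_seconds=1.0):
    """Run load_fn, waiting and retrying whenever it reports a lock.

    Generic sketch of a wait-and-retry loop; max_attempts and
    wait_seconds are arbitrary illustrative defaults.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return load_fn()
        except LockedError:
            if attempt == max_attempts:
                raise  # still locked after all attempts: give up
            time.sleep(wait_seconds * attempt)  # back off a bit longer each time
```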

If this all blows up in my face, I'll be sure to post my lessons learned back to this thread. Regardless of the outcome, thanks everyone for your input!

Answers (3)


alex_campbell
Contributor
0 Kudos

Thanks Pawan and Ada for your helpful answers!

It sounds like loading to the Cube is the preferred approach. I do have one lingering concern though. As I said above it's likely that multiple users will be loading key figures at the same time. They will be loading to the same planning area but the CVCs should be distinct (each one is responsible for different locations). This raises two follow-up questions:

  1. Is it possible to load more than one file to a cube at a time?
  2. Is it possible to load multiple cubes into a planning area at a time?
former_member187488
Active Contributor
0 Kudos

Hello

1. Is it possible to load more than one file to a cube at a time?
-> Sorry, I don't know. Someone who knows BW well should be able to answer this.

2. Is it possible to load multiple cubes into a planning area at a time?
-> Before note 1385695 you were able to do this, and even with note 1385695 applied you can still do it via the solution in note 1484063, but we do not suggest doing so. Note 1385695 adds an exclusive lock on the selections you use in /SAPAPO/TSCUBE. Unless the two load processes have no locking conflict, you cannot load multiple cubes into a planning area/planning version at the same time.

Besides, I don't think it's normal for several users to run /SAPAPO/TSCUBE at the same time; the transaction isn't meant to be used that way. The locking logic added to /SAPAPO/TSCUBE is actually there to prevent simultaneous changes from the planning book. /SAPAPO/TSCUBE is a transaction for planning data preparation: you load data into the planning area/planning version there, and then users work with the data in planning books. Maybe you should recheck your process.

Best Regards,
Ada

former_member209769
Active Contributor
0 Kudos

Hi Alex,

The answer to both your questions is yes.

For question 1, you would get a separate request in the cube for each load, so you could have data for the same CVC in several requests unless you delete the older ones. If you are not comfortable in BW, discuss this point with a BW expert; otherwise you might end up sending both current and old data for the same CVC to the planning area.

For question 2, keep Ada's comments in mind. If you don't follow note 1484063 once note 1385695 is applied, you will only be able to run one load at a time, and any later loads will keep failing due to the data lock. It is better NOT to run the full Excel-to-planning-area chain every time a user changes data. Have the data arrive in the cube at the users' convenience (or on some frequent schedule), and then load from the cube to the planning area at FIXED timings (if possible, once a day). From the cube to the planning area you might need to split the load if the data volume is very high (as in our case). Once you apply the relevant notes, you can run those loads in parallel.

Thanks - Pawan

former_member187488
Active Contributor
0 Kudos

Hello,

I don't think you should use the BAPI if your purpose is to put data into liveCache so that users can view it in the planning book.

As Pawan said, the standard way would be to use an InfoCube. Loading from Excel to an InfoCube should be standard functionality in BW (though I'm not very familiar with it). Then you can load data from the InfoCube into liveCache using transaction /SAPAPO/TSCUBE in the APO system. This is a common operation in APO.

/SAPAPO/TSCUBE puts data from the InfoCube directly into liveCache, but if you use the BAPI, the planning book is involved, much as in interactive planning. The system will consider the settings of the planning book and data view, together with the selection profile, aggregation level, and even macros, which complicates things. /SAPAPO/TSCUBE is a purer tool than the BAPI, and it should be the proper tool for you to use.

Best Regards,
Ada

aparna_ranganathan
Active Contributor
0 Kudos

Alex

I don't know the answer to your question, but I am curious why you are not transferring data from R/3 and using macro calculations to derive the key figure values the end users need. Is that because you don't have R/3?

alex_campbell
Contributor
0 Kudos

Hi Aparna,

The group asking for this is not fully on our ECC system, but they still want to leverage our SCM system for their Demand Planning needs. I believe there are also some more "predictive" key figures they would like to load. For figures like forecasts, they feel more comfortable maintaining an Excel file and revising it each month; this way they can use their intuition rather than a predefined algorithm.

former_member209769
Active Contributor
0 Kudos

Hi Alex,

Getting the data to the cube and then to the planning area should be a fool-proof method, as it stays closer to the SAP standard. SAP BW (within APO it's the same SAP BW) provides a standard way to get data from Excel to a cube, and you would then use the standard transaction for loading data from the cube to the planning area.

I would say the BAPI is more of an add-on kind of functionality. I am not really sure whether the BAPI would be faster than loading from the cube to the planning area, especially since the cube route involves two steps: (a) data from Excel to the cube, and (b) data from the cube to the planning area.

The BAPI might be faster (I'm not sure), but getting data from Excel to the cube is also something we do here for a few exceptional cases. The cube route is also helpful if you later need to re-load data to the planning area after data issues there.

PS: To give you a flavour of the "direct upload from Excel" functionality, it also has limits on how many CVCs you can select for upload. As the number of CVCs keeps increasing, at some point we start getting system time-outs, and SAP then suggests breaking the upload into smaller chunks. So it's more useful for smaller volumes of data.
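That advice to break a large upload into smaller chunks can be sketched like this (the default chunk size is an arbitrary placeholder, not an SAP limit — you would tune it against your own time-out threshold):

```python
def chunk_cvcs(cvcs, chunk_size=500):
    """Split a list of CVCs into fixed-size batches for separate uploads.

    A simple sketch of breaking one large upload into smaller ones;
    the default of 500 is a made-up number, not an SAP-documented limit.
    """
    return [cvcs[i:i + chunk_size] for i in range(0, len(cvcs), chunk_size)]
```

Each batch would then be uploaded as its own request, keeping every individual load small enough to finish before a time-out.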

Thanks - Pawan