
How to perform DP disaggregation through non-standard means?

Hello all,

We have a new requirement on our project that we're trying to solve, and I'm hoping the community can provide some insights.

The requirement is to provide an Excel-based data upload mechanism for one of our DP planning books, one that isn't memory-constrained like the standard download/upload mechanism provided out of the box.

The business scenario is that our DP planners receive various planning adjustments (which correspond to specific key figures in the book) for all the CVCs in their portfolios. This data can run to thousands of records, and rather than keying them in by hand, the planners would like a way to upload them from Excel.

Now, the standard planning book download/upload mechanism would work, except that drilling down on product and customer within the book causes severe performance and memory issues, and the user's session essentially crashes. So we need to design another way to get this data into the system.

The idea so far is to create an Excel upload utility and wrap it in a Web Dynpro application. The utility would prompt the user for the CVC characteristics that describe the data being uploaded, the key figure it should be written to, and the time buckets the data is valid for... very similar to using the shuffler and navigating within the planning book to produce a download template if one were using the standard functionality.
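
To make the shape of the uploaded data concrete, here is a rough sketch of the kind of record the utility might write to the custom staging table. All names below are placeholders for illustration only; the real characteristic and key figure fields depend on our planning object structure and planning book.

* Illustrative only: one uploaded planning adjustment row.
* The characteristic and key figure fields are placeholders; the real
* ones depend on the planning object structure (MPOS) and planning book.
TYPES: BEGIN OF ty_upload_row,
         product    TYPE c LENGTH 18,             " product characteristic of the CVC
         customer   TYPE c LENGTH 10,             " customer characteristic of the CVC
         key_figure TYPE c LENGTH 30,             " target key figure in the planning book
         date_from  TYPE d,                       " start of the time bucket
         date_to    TYPE d,                       " end of the time bucket
         value      TYPE p LENGTH 15 DECIMALS 3,  " adjustment value to be loaded
       END OF ty_upload_row.
DATA: gt_upload TYPE STANDARD TABLE OF ty_upload_row.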

Once the selections are made and the data has been uploaded to a custom back-end table for processing, we need to somehow load it into the planning area and perform the standard disaggregation; this is where my questions lie.

How does one load data from a DSO into the planning area, and then perform disaggregation on that data?

Thank you in advance for the help; I will award points and kudos for any assistance given.

Best,

Christopher

Former Member replied

You need, for example, a cube with the same characteristics as the data input by the user (customer and product); you don't need more than that.

Put your data there and map the matching characteristics to the MPOS in /SAPAPO/TSCUBE. Use it to copy the data to the time series at THAT level (the "grouping condition"); disaggregation will populate the missing levels automatically.
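
If you want the copy to run without user interaction once the uploaded data is in the cube, you can put your /SAPAPO/TSCUBE settings (planning area, source cube, key figure assignment, grouping condition) into a saved variant and trigger it from your upload program or a background job. A minimal sketch, assuming the report behind the transaction is /SAPAPO/RTSINPUT_CUBE in your release and that a variant called ZDP_UPLOAD already exists - verify both in your system:

* Minimal sketch: trigger the InfoCube -> planning area copy in background.
* Assumes /SAPAPO/RTSINPUT_CUBE is the report behind /SAPAPO/TSCUBE in
* your release, and that variant ZDP_UPLOAD holds the planning area,
* source cube, key figure assignment and grouping condition.
SUBMIT /sapapo/rtsinput_cube
  USING SELECTION-SET 'ZDP_UPLOAD'
  AND RETURN.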

In BW you write only at the detailed level; in DP you can write at any level, with disaggregation filling in the holes "by itself" when the system saves the data.
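
To put made-up numbers on "filling the holes": say the upload writes 1,000 units for one product/customer combination in a month, and the MPOS also carries a location characteristic where that combination currently holds 300 units at LOC1 and 100 at LOC2. With plain proportional disaggregation the save would distribute 750 to LOC1 and 250 to LOC2; the actual behaviour depends on the calculation type configured on the key figure.

* Made-up numbers: proportional split of an aggregate value across a
* lower-level characteristic (location), based on existing detail values.
DATA: lv_uploaded TYPE p LENGTH 15 DECIMALS 3 VALUE '1000',
      lv_loc1_old TYPE p LENGTH 15 DECIMALS 3 VALUE '300',
      lv_loc2_old TYPE p LENGTH 15 DECIMALS 3 VALUE '100',
      lv_loc1_new TYPE p LENGTH 15 DECIMALS 3,
      lv_loc2_new TYPE p LENGTH 15 DECIMALS 3.

lv_loc1_new = lv_uploaded * lv_loc1_old / ( lv_loc1_old + lv_loc2_old ). " 750
lv_loc2_new = lv_uploaded * lv_loc2_old / ( lv_loc1_old + lv_loc2_old ). " 250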

Thanks,

J.
