
How to Perform DP Disaggregation Through Non-Standard Means?

Former Member
0 Kudos

Hello all,

We have a new requirement on our project that we're trying to solve, and I'm hoping the community can provide some insights.

The requirement is to provide an Excel-based data upload mechanism for one of our DP planning books, one that isn't memory-constrained the way the standard out-of-the-box download/upload mechanism is.

The business scenario is that our DP planners receive various planning adjustments (which correlate to specific key figures in the book) for all the CVCs in their portfolios.  This data can sometimes exceed thousands of records, and rather than keying them in by hand they would like a way to upload them from excel.

Now, the standard planning book download/upload mechanism would work, except that drilling down on product and customer within the book causes severe performance/memory issues and the user's session essentially crashes.  So we need to design another way to upload this data into the system.

The idea so far is to create an Excel upload utility and wrap it in a Web Dynpro.  The utility would prompt the user for the CVC characteristics that describe the data they are uploading, which key figure they are uploading to, and which time buckets the data is valid for... very similar to using the shuffler and navigating within the planning book to produce your download template if one wanted to use standard functionality.

Once the selections are made and the data is uploaded to a custom back-end table for processing, we need to somehow load it into the planning area and perform the standard disaggregation; this is where my questions lie.

How does one load data from a DSO into the planning area, and then perform disaggregation on that data?

Thank you in advance for the help; I will award points and kudos for any assistance given.

Best,

Christopher

Accepted Solutions (1)

rico_frenzel
Advisor
0 Kudos

Hello,

Exactly. Disaggregation is technically nothing other than stored procedures (written and compiled in C++). They are shipped with the liveCache releases. There is no way to develop your own disaggregation.

The only solution is to build a z-program around it that reads/writes the data at the more detailed levels and performs the distribution of the values in your own program.
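For illustration, here is a minimal sketch of the pro-rata logic such a z-program would have to perform (in Python rather than ABAP, purely to show the arithmetic; the function name and the equal-split fallback for an all-zero detail level are my assumptions, not documented system behaviour):

```python
def distribute(total, detail_values):
    """Spread an aggregate value across detail members pro rata to
    their current values; fall back to an equal split when every
    current value is zero (no existing proportions to reuse)."""
    base = sum(detail_values)
    if base == 0:
        n = len(detail_values)
        return [total / n] * n
    return [total * v / base for v in detail_values]

# e.g. distribute(70.0, [1.0, 2.0, 4.0]) keeps the 1:2:4 ratio
```

The real work in such a z-program is then reading the detailed time series, applying this kind of split, and writing the results back.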

Best regards

Rico Frenzel

Former Member
0 Kudos

Hi Rico,

Thanks for the reply!  I'm trying to figure out if I need to build a custom z-program and call it in a BW transformation, or if loading data to the planning area using TSCUBE will do the disaggregation.

The externally sourced data will likely be at the Customer / Product level... think customer forecast... but my CVC structure is much more granular than that.

My CVC structure has 7 levels: Product, Commercial Product Code, PPG, Product Family, Plant, Corp. Customer, and Planning Customer.

So if my data is at the Planning Customer / Product level in a cube and I load it to the planning area using TSCUBE, will it disaggregate 'automatically' using the standard stored procedures you mention?

Thanks!

C.

Former Member
0 Kudos

You need, e.g., a cube with the same characteristics as the data input by the user (customer and product); you don't need more than that.

Put your data there and map the matching characteristics to the MPOS in TSCUBE. Use it to copy data to the time series at THAT level ("grouping condition"); disaggregation will populate the missing levels automatically.

In BW you write only at the detailed level; in DP you can write at any level, with disaggregation filling the holes "by itself" when the system saves the data.
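To make the "copy at THAT level" idea concrete, here is a toy Python sketch (field names are invented for illustration, this is not the actual TSCUBE logic) of the matching step that happens before disaggregation takes over: an aggregate-level record fans out to every detail CVC that shares its characteristics.

```python
def matching_cvcs(record, cvc_table):
    """Return the detail CVCs whose customer and product match the
    aggregate-level record; disaggregation then fills the remaining
    levels (plant, product family, ...) for exactly these rows."""
    return [cvc for cvc in cvc_table
            if cvc["customer"] == record["customer"]
            and cvc["product"] == record["product"]]
```

The grouping condition in effect decides which characteristics participate in this match; everything below them is left to disaggregation.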

Thanks,

J.

Answers (2)

Former Member
0 Kudos

Hello,

I agree completely with James S.A. I just wanted to add that you can also control how your disaggregation is performed based on another key figure in the planning area.

All you have to do is go to /sapapo/msdp_admin and change the settings for the key figure to disaggregation based on another key figure, and specify the basis key figure.
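As a rough illustration of "disaggregation based on another key figure" (Python pseudocode, not the actual liveCache stored procedure; the function name is mine): the proportions come from the basis key figure, e.g. historical sales, rather than from the target key figure itself.

```python
def disaggregate_by_basis(total, basis_values):
    """Split `total` across detail members using the proportions of a
    basis key figure (e.g. historical sales), not the target values."""
    base = sum(basis_values)
    if base == 0:
        # no basis proportions available; the real system would apply
        # its configured fallback behaviour here instead
        raise ValueError("basis key figure is empty")
    return [total * b / base for b in basis_values]
```

So a forecast of 200 over detail members with basis values 2:3:5 lands as 40/60/100 at the detailed level.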

Hope this helps you a little bit.

For the data upload, it's better if you post all your requirements in

http://scn.sap.com/community/data-warehousing

regards,

Anurag

Former Member
0 Kudos

Hi Anurag,

Thanks for the reply!  Please see my reply / further question to James above... I'm trying to figure out if using TSCUBE to load externally sourced data from a cube, which is at a summary level (i.e. Customer & Product) will invoke the ‘standard disaggregation’ or if I have to load the data to the planning area at the lowest level.

Thanks

C.

Former Member
0 Kudos

Hi,

About your two questions:

- disaggregation: I believe it is easier than all that. Once you solve your first problem (the upload to DP), disaggregation will happen automatically and almost instantly, and you almost can't prevent it. When you upload to an "aggregated level" (say SKU level), the system will automatically disaggregate down to the most detailed level (say SKU-location-customer-country-etc.). You can influence this behaviour by customizing which disaggregation you want for the key figure you are using (against itself, against another key figure, etc.). But unless you explicitly configure your key figure to "no aggregation", it will always act, simply because the system needs to reach the detailed level to store the values.

- data upload: you will need somebody with average BW knowledge: liveCache will behave like a cube/ODS, and "all" you have to do is configure the usual BW things, from source to target. This is an issue that appears regularly on this forum (e.g. http://scn.sap.com/thread/1077291, for SNP, but it should work equally well). I am no BW expert, so I cannot solve this part for you.

thanks,

J.

Former Member
0 Kudos

Hey James, thanks for the reply.  It’s totally possible that I might be overthinking this one…

I guess the crux of my question is around the 'automatic disaggregation' and how it actually happens.

I know that when a planner transacts through the planning book, the disaggregation happens as standard functionality, and if data is entered at a higher CVC level it gets blown down to the lowest level of granularity using the disaggregation key figure settings in the planning area (MSDP_ADMIN).  So that’s standard functionality… check… but what if I load data from an external source (i.e. my data upload utility) and don't interact with the planning book?

 

I can envision getting this external data into a DSO and piping it through to a cube using standard BW data modeling (I’m a BW guy, so I’m pretty confident on this bit)… and I can see using TSCUBE to load the newly loaded data from the cube to the planning area. But here’s where I’m stuck: at what granularity level does the cube data need to be in order to load it to the planning area properly?

If my data is at a PPG level in the cube, will it disaggregate ‘automatically’ when I load it to the planning area using TSCUBE? 

Or… do I need to do what Rico is alluding to below, and recreate the disaggregation engine in my BW transformations to get summary level data down to the lowest level before loading to the planning area?

I don’t want to reinvent the wheel here, but I don’t know how to call or leverage the automatic disaggregation process that occurs when interacting with the planning book.

Hope that all makes sense and really really appreciate everyone’s thoughtful input and comments.

Thanks again

Christopher