
Disaggregation in Demand Planning

Former Member

Hi Experts,

I'm trying to make my monthly forecast disaggregate into weekly buckets based on a 5-day week (Monday to Friday), not 7 days. I do have the disaggregation working, but it appears to be based on 7 days.

I know this because I've removed demand from Feb 2014, and when it disaggregates I see a small demand in week 9, which I presume comes from March 1st & 2nd (Saturday & Sunday).
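For illustration only (plain Python, not SAP's internal logic), this sketch shows why a 7-day calendar split leaks March demand into week 9: 1 and 2 March 2014 fall on the Saturday/Sunday of the ISO week that starts on Monday 24 February.

```python
from datetime import date

# Hypothetical illustration: under a 7-day split, part of the March 2014
# demand lands in ISO week 9, because 1-2 March (Sat/Sun) belong to the
# week starting Monday 24 February.
march_days = [date(2014, 3, d) for d in range(1, 32)]
in_week_9 = [d for d in march_days if d.isocalendar()[1] == 9]
share = len(in_week_9) / len(march_days)  # 2 of 31 days, about 6.5% of March
```

With a 5-day working calendar those two weekend days would carry no weight, which is exactly the behaviour being asked for here.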


The planning area has weeks & months in the Storage Bucket Profile.

The Time Stream ID is set to the working time calendar (with gaps). A period split profile has been created from months to weeks, and the distribution function is created with 7 periods, the last two of which are blank. I don't have a separate planning book for the weekly view, only a different data view.

The KF calculation type is S and the Time Disaggregation type is P.

Can the monthly demand be split into weeks using a 5 day week?

What else do I need to set up to achieve this?

Any help is appreciated. Thanks

Accepted Solutions (1)

former_member187488
Active Contributor

Hi, not sure which scenario you're using:
- If you're using DP monthly planning book to input forecast into monthly bucket, and the disaggregation is performed in liveCache, please refer to note 1902413.
- If you're releasing forecast from DP to SNP, you should check:
  1) In the DP planning book, load the data and switch the display time bucket profile to the storage bucket profile by pushing the small calendar-like button at the top left of the grid. This displays the data in technical periods (see note 737230 - the data is also saved in weekly buckets even though you do not have a weekly view). First check here whether you already have forecasts on the weekend (March 1st & 2nd). If yes, you still need to refer to note 1902413.
  2) If there is no problem on the DP side, you need to consider your way of releasing. The first question is how the data is read from liveCache. Generally it is read from the technical periods if you use /sapapo/MC90, but if you set a "planning bucket profile" there, it will read using the planning bucket profile. If you use DP mass processing, it reads from the time bucket profile of the data view (see note 403050).
  3) Then consider how the data is split after being read from liveCache and before being saved to SNP. Since you're using a period split profile, please check the result in /sapapo/sdp_split.


Former Member

Thanks Ada, very informative.

We're forecasting in monthly buckets and disaggregating in liveCache. I've viewed the storage bucket profiles with /SAPAPO/OM_TS_BROWSER and can see that each day has a weighting.

Can these weightings be reset or overridden with something else?

What we want is for the monthly forecast to disaggregate into weekly buckets, and then to transfer the weekly forecast to ERP. But when it disaggregates we want it to ignore Sat & Sun and only disaggregate into Monday to Friday of the working calendar (ignoring holidays).

Can you please suggest the best way to achieve this?

thanks

Paul

former_member187488
Active Contributor

Hi Paul, so the issue lies on the DP side ... As per note 1902413, "The internal factors can not be changed. They are calculated during initialization of the planning area ...", you'll need to de-initialize and re-initialize your planning area, with data backup and restore ... no one likes to do that.

Here you still have an option if you use /sapapo/mc90: you assign a "planning bucket profile", which should be a time bucket profile with monthly periods only. Then you assign your period splitting profile.
If you use DP mass processing, it would be easier -- you just use a monthly data view.
Which scenario are you using? Which transaction do you use to transfer the forecasts to ECC?

The best way would be to use DP mass processing, I guess ...

Former Member

Hi Ada, I've tried to re-initialize the planning area in our Dev system, but it still keeps the same internal factors for the time series. It looks like we are stuck with these.

Is there any other way to override / change / replace the internal factors?

Our data transfer to ECC uses a transfer profile / mass processing. I've tried to use a period split here, but still without much success. It does split, just not how we expect it to using the Period Split and Distribution Function.

Period split: 10 monthly periods (start) split to weeks (target periods) with a Distribution Function. The Distribution Function has 12 periods, all with the same value.

I don't quite understand how the Distribution Function period values work, except that it should split evenly.
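As I understand it (my reading of the profile, not verified against SAP's exact algorithm), the distribution-function values act as relative weights per target period: equal values give an even split, and zero or blank values exclude a period. A minimal sketch of that idea:

```python
def distribute(total, weights):
    """Split total across periods in proportion to the given weights;
    equal weights give an even split, zero-weight periods get nothing."""
    s = sum(weights)
    return [total * w / s for w in weights]

# 7 daily periods with the last two blank, as in a Mon-Fri setup:
split = distribute(70.0, [1, 1, 1, 1, 1, 0, 0])
# -> [14.0, 14.0, 14.0, 14.0, 14.0, 0.0, 0.0]
```

Under this reading, 12 periods with the same value would simply spread a source period evenly over 12 target periods, regardless of working days.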

Any guidance you can provide would be appreciated.

thanks

Paul

former_member187488
Active Contributor

Hi Paul, you must de-initialize the planning version first and then re-initialize.
What confuses me is: you want to split into weeks, so why are you concerned about the values on Saturday and Sunday? Please kindly explain with a sample ...

Coming to the period split ... it's really not easy to use, and the calculation is complicated ... just consider the fact that we don't always have the same number of weeks in a given number of months ... I have gone into this logic deeply, but explaining it would be a nightmare ...

Former Member

Hi Ada, by de-initialize you do mean deleting the PA's time series objects (which I did) and creating the TSOs again? This didn't change the internal factors, and there was only a single version attached to the PA.

An example: if I add demand into June in Demand Planning and transfer it to ECC with the split profile, the first demand appears in ECC on 26.05.2014. Although small, it creates confusion. I do understand that June 1 is in week 22, the same week as 26 May, and this is why the demand appears there.

If you then try to compare the monthly time buckets in ECC with DP, there is a variance because of this.

What we're trying to achieve: the first demand for this month should be on 3rd June (the 2nd is a public holiday for us), and all of the demand should be split into weeks 23, 24, 25, 26 and 27 according to the working days in each of these weeks, not the calendar days. So if our demand were 20, it would split 4, 5, 5, 5, 1.
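The desired working-day split can be sketched in plain Python (a hypothetical illustration of the target arithmetic, not an SAP transaction or API):

```python
from datetime import date, timedelta

def working_day_split(month_demand, first_day, last_day, holidays=()):
    """Split a monthly demand over ISO weeks in proportion to the
    Mon-Fri working days (minus holidays) each week contributes."""
    days_per_week = {}
    d = first_day
    while d <= last_day:
        if d.weekday() < 5 and d not in holidays:  # Mon=0 .. Fri=4
            week = d.isocalendar()[1]
            days_per_week[week] = days_per_week.get(week, 0) + 1
        d += timedelta(days=1)
    total = sum(days_per_week.values())
    return {w: month_demand * n / total for w, n in days_per_week.items()}

# June 2014 with 2 June as a public holiday: 20 units over weeks 23-27
split = working_day_split(20, date(2014, 6, 1), date(2014, 6, 30),
                          holidays={date(2014, 6, 2)})
# -> {23: 4.0, 24: 5.0, 25: 5.0, 26: 5.0, 27: 1.0}
```

This reproduces the 4, 5, 5, 5, 1 split described above: week 23 has only four working days (the 2nd is a holiday) and week 27 contains just 30 June.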

To make sure I understand correctly:

Internal factors are used to calculate the disaggregation in DP. So if you use a data view in weeks to transfer the forecast, it calculates on the internal factors, which we can't change.

The period split is attached to the transfer profile and is driven by what is set up in the period split and distribution function. This is not influenced by the internal factors.

Regards

Paul

former_member187488
Active Contributor

Hi Paul, about the disaggregation factors: have you assigned a time stream to your storage bucket profile? Is this time stream correctly set in /sapapo/calendar?

Former Member

Hi Ada,

An important piece of info which I left out: yes, we have a Time Stream ID assigned, but up until a few weeks ago the time stream used the "Period Calendar" (no gaps); we have since changed this to the "Working Time Calendar" (with gaps).

We believe that the time stream is correct, as it only shows the working days (Mon to Fri) and also excludes our public holidays.

The issue is the change from the 7-day calendar to the 5-day calendar, and how SAP still retains the information based on the 7 days.

Regards

Paul

former_member187488
Active Contributor

Hi Paul, please check the factory calendar attached to your time stream ... If the period factors are still not correct ... it's strange ... maybe you can attach the factory calendar, the time stream calendar and the disaggregation factors ...

Former Member

Hi Ada,

The "Planning" calendar has a calculation rule with the 5 weekdays. Displaying the periods, there are 5-day blocks and then 2 days missing from the calendar, as shown.

And this is the same calendar (Time Stream ID) attached to the Storage Bucket Profile used in the PA that I have tried to update.

So far, every attempt I've made to delete the system's storage bucket profiles has been unsuccessful.

Can you offer any further suggestions?

Thanks

Paul

Answers (0)