
Problem during extraction of SNP planning area data to BW

Former Member
0 Kudos

Hello Experts,

We have a performance problem extracting planning area data to the BW system.

The data is split into data packages of small sizes. Does anyone have any idea why the packet size is so small?

I checked with the BW team; they say there has been no change in the settings for the data package size.

Recently we deinitialized and reinitialized the SNP planning area, and only after this reinitialization did the problem appear.

Any suggestions are welcome.

Thanks in Advance.

Regards,

Srini

Accepted Solutions (1)


Former Member
0 Kudos

Hi Srini,

Please check whether all time series data was restored during reinitialization. Your observation of small data packet sizes suggests some data may not have been restored when the planning area was reinitialized.

Moreover, if future buckets do not contain enough data, that can also cause small data packet sizes.

Regards

R. Senthil Mareeswaran.

Answers (2)


Former Member
0 Kudos

Hi Srinivas,

You can set the packet size an InfoPackage handles in the InfoPackage itself. Go to "Scheduler" in the menu at the top left-hand corner and select "DataS. Default Data Transfer". A window pops up where you can enter the default packet size the InfoPackage handles. We set it to 50,000.

When you say "the data is split into data packages of small sizes", where did you check this?

Check the above and compare packet sizes.

The number of records extracted from SNP = No. of CVCs * No. of planning buckets for which the planning area is initialized. This is usually in the millions. It does not change whether you extract 2 or 10 key figures.
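As a rough sanity check, the record volume and resulting packet count implied by that formula can be estimated like this (the figures below are purely illustrative, not taken from any real system):

```python
import math

# Illustrative figures only -- substitute your own system's values.
num_cvcs = 50_000      # characteristic value combinations in the planning area
num_buckets = 104      # e.g. weekly buckets over a ~2-year initialization horizon
packet_size = 50_000   # default data transfer size set in the InfoPackage

# Extraction volume is CVCs x buckets, independent of the key figure count.
total_records = num_cvcs * num_buckets
num_packets = math.ceil(total_records / packet_size)

print(f"Expected records: {total_records:,}")  # 5,200,000
print(f"Expected packets: {num_packets}")      # 104
```

If the number of packets jumps while the total record volume stays flat, the packets themselves must have shrunk, which matches what Srini describes below.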

When you say your volume is multiplied by 5, was there an increase in the initialization horizon?

A few other good options when you are extracting data:

In /sapapo/msdp_admin, go to Extras > Data Extraction Tools, select your DataSource, and click Generate DataSource.

1. Select the checkbox "No extraction of Data Records without a keyfigure value". This reduces the number of records loaded to BW, since periods with no value are not considered.

2. Use a parallel processing profile.
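To illustrate what option 1 does conceptually, here is a small Python sketch (this is only an illustration of the filtering idea, not the actual extractor logic; the record layout and key figure names are made up):

```python
# Conceptual sketch of "No extraction of Data Records without a
# keyfigure value" -- rows whose key figures are all empty are skipped.
records = [
    {"cvc": "PROD1/LOC1", "bucket": "2009-W10", "kf1": 120.0, "kf2": 30.0},
    {"cvc": "PROD1/LOC1", "bucket": "2009-W11", "kf1": 0.0,   "kf2": 0.0},
    {"cvc": "PROD2/LOC1", "bucket": "2009-W10", "kf1": 0.0,   "kf2": 0.0},
    {"cvc": "PROD2/LOC1", "bucket": "2009-W11", "kf1": 55.0,  "kf2": 0.0},
]

# Keep only rows where at least one key figure carries a value.
extracted = [r for r in records
             if any(r[kf] != 0.0 for kf in ("kf1", "kf2"))]

print(len(extracted))  # 2 of 4 rows survive
```

With sparse future buckets, a large share of rows can be empty, so this option alone can shrink the load volume considerably.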

Did I get your point right?

Former Member
0 Kudos

Hi Vishu,

Thanks for your reply,

For checking the packet size, we looked in our BW system and also at the job log of the extraction job in the APO system. I compared job logs from before and after the reinitialization of the PA.

No, the volume of records has not multiplied. Only the number of data packages has increased; in these data packages we used to get records in the 1000s, but now it's in the 100s.

    • Regarding data records without key figure values, we have already done this.

    • The date range for planning area initialization is the same as before.

Please let me know if you need further clarification.

Thanks,

Regards,

Srini

Former Member
0 Kudos

Hi Senthil,

Thanks for your quick response.

We have only 2 time series key figures in our SNP planning area, and we use macros and a user function to display these two key figure values in the planning book. So I guess there is no need to restore these two key figures; all the others are order series key figures.

Background: Earlier we used a remote cube to display history from the DP planning area in SNP planning books. We have since removed this and developed a function module and a macro to fetch the DP data and display it in these two key figures in the SNP planning area.

After reinitializing the PA, the number of data packets increased by 5 times, and in line with this the runtime has also increased by 5 times.

One question:

Do you know why we get smaller packets when future buckets do not have enough data, and also when some data is not restored?

Could you please explain how to check whether the full data was restored?

Thanks in Advance.

Regards,

Srini

Edited by: Srinivasa E on Mar 3, 2009 6:44 PM