Data Load from Cube to Planning area taking more time

Former Member

Hi All,

When we load data from the Cube to the planning area, the job runs for almost 1 hour, and as a result other dependent jobs are getting delayed.

We have one RTSINPUT step with a Parallel Processing profile of 2 PP and a block size of 1000, but it is still taking a long time.

So we are thinking of splitting the RTSINPUT job into two jobs with different selections.

Can anybody comment on this? The jobs will access the same cube but with different selections, so it should not be an issue to load from the Cube to the planning area with split jobs - and what about the PP?

BR,

Ankit Bassi

Accepted Solutions (1)


Former Member

Dear Ankit

Definitely you can use more than one job to load data from the same cube to the planning area with different selections. Please also select the "selections will not be locked" option on the additional settings tab.

With this option you can run more jobs in parallel to reduce the run times.

You can create different PP profiles like the ones below and experiment to see which one helps to reduce the run times.


Please let me know if this helps you, or if you require any additional information from me.

Par. Process    Profile              Block size
3                                    10.000
3               PARALLEL_PROFILES    100
5               PARALLEL_PROFILES    500
5               PARALLEL_PROFILES    2.000
5               PARALLEL_PROFILES    300
3               PARALLEL_PROFILES    200
10              PARALLEL_PROFILES    200
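The interplay between the number of parallel processes and the block size can be illustrated outside SAP. The sketch below (plain Python, all names hypothetical - this is not an SAP API) partitions a set of records into blocks and spreads them over a pool of workers, mirroring what a PP profile does during the load:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_blocks(records, block_size):
    """Partition the records into blocks, as the PP profile's block size does."""
    return [records[i:i + block_size] for i in range(0, len(records), block_size)]

def load_block(block):
    """Stand-in for loading one block into the planning area."""
    return len(block)  # pretend every record in the block is written

def parallel_load(records, parallel_processes, block_size):
    """Distribute the blocks across the given number of parallel workers."""
    blocks = split_into_blocks(records, block_size)
    with ThreadPoolExecutor(max_workers=parallel_processes) as pool:
        loaded = sum(pool.map(load_block, blocks))
    return loaded, len(blocks)

# With 2 PP and block size 1000, 10,000 records yield 10 blocks across 2 workers.
loaded, block_count = parallel_load(list(range(10_000)),
                                    parallel_processes=2, block_size=1000)
print(loaded, block_count)  # 10000 10
```

The point of trying several profiles is visible here: if the block size is so large that there are fewer blocks than workers, the extra parallel processes sit idle, while a very small block size adds per-block overhead.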

Answers (1)


alok_jaiswal
Contributor

Hi Ankit,

A few pointers to consider -

1. Has the volume of data being loaded to the planning area from the InfoCube increased recently, or has it remained constant? Please check the job logs.

2. In transaction /SAPAPO/SDP_PARB you can see that for the application "Load Planning Area" the default block size is 1000, the minimum is 100, and the maximum is 25000. This is as per the SAP recommendation.

3. You can test two options. First, split the selection in two and run both halves in parallel with the existing 2 PP, and note the run time. Second, increase the PP to, say, 4 and run the existing job both without the split and with it. This will give you good comparative runtime data on which option to select. The end result will be the same; only the runtime will differ. Based on the results, you can consider increasing the PP/block size.
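The first option - splitting one selection into two disjoint halves so both jobs can run in parallel - can be sketched as follows (plain Python, hypothetical names, not an SAP transaction). The key property is that the two halves share no values, so the jobs never compete for the same characteristic combinations:

```python
def split_selection(values):
    """Split a selection (e.g. a product list) into two disjoint halves
    so two RTSINPUT-style jobs can run in parallel without overlap."""
    mid = len(values) // 2
    return values[:mid], values[mid:]

# Hypothetical product selection for illustration.
products = [f"PROD{i:03d}" for i in range(1, 9)]
first, second = split_selection(products)

assert set(first).isdisjoint(second)               # no shared values, no lock conflicts
assert sorted(first + second) == sorted(products)  # together they still cover everything
print(first, second)
```

Any split rule works (by product range, location, etc.) as long as the two assertions above hold: the selections are disjoint and their union covers the original selection.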

Please do some tests in your Dev/Quality systems before making changes in production. The PP/block size can be changed via /SAPAPO/SDP_PAR.

Hope it helps you.

P.S. - In our project we have defined 3 PP and 6 PP profiles with a block size of 10,000, and depending on the requirement/data volume of the various steps, the appropriate one is used. For us, 6 PP obviously gives much better results.

Regards,

Alok