
How is data loaded from an InfoCube to a planning area - need technical info

AA3
Participant

I would like to find out how data is loaded from the InfoCube to the planning area.

I know that the cube has tables, but how does that data get loaded into the planning area? Where is the mapping from which I can compare the data in the cube with the same data that reaches the planning area?

Say, for example, I have the below values in the InfoCube:

Prod1 --> Loc1 --> available provisioning qty

AAA      AB90     100

Then where do I check in the planning area tables (are there any tables mapped) after running the TSINPUT program? I know it can be checked in the planning book, but I want to check in the planning area tables, if they exist.

Accepted Solutions (1)

syed_hassan2
Explorer

Use transaction /SAPAPO/TSCUBE.

Identify the cube as the data source and the planning area as the target, provide the dates, and map the key figures that you will be loading into the planning area.

regards,

Najam

AA3
Participant

I know about the TSCUBE tcode and the key figure mapping there, but I would like to know the tables.

The reason I ask is that in our case I have a value X in a key figure in the InfoCube, but in the planning area I get values of X100 or in some cases X200. I have also checked the 'add data to key figure' checkbox in the variant, but we are not using it.

Any ideas why this is happening?

Former Member

Did you check if there are any extra CVCs in the planning area which no longer exist in the cube? There are no tables you can check, since the data is stored in the time series liveCache.

AA3
Participant

I would like to browse the planning area data. Is there any function module or transaction code to read the planning area?

Former Member

Hi,

The data is loaded from the InfoCube to the planning area using update rules. The issue you have mentioned seems to be that two requests contain data for the same CVCs in the cube.

Example: for the same CVC, two requests are available in the InfoCube. When you copy the data using the TSCUBE transaction, whatever data is available in the cube for that CVC gets copied into the planning area.

CVC1 - cube - old request - 100

CVC1 - cube - actual request - 200

CVC1 - planning area = 300 (the value is supposed to be 200, but since the old request also contains data in the cube for the CVC, it gets copied as 300)

Issue: there might be two requests containing data for the same CVC.

Solution:
1. Check the data in the cube using transaction LISTCUBE.
2. Delete the old request and check the data again.
3. If it matches your requirement, run TSCUBE again.

Please let me know if you need additional information.

Thanks,

Jeysraj

AA3
Participant

Can you explain where I can view the update rules between the planning area and the InfoCube?

former_member698542
Active Contributor

Hi,

You can check the update rules (transformations) for an InfoCube in transaction RSA1. Go to the InfoCube (data targets or InfoProviders) and drill down.

The load from the InfoCube to the planning area is always done through the load program /SAPAPO/RTSINPUT_CUBE only. In the program variant you define the source InfoCube and the target planning area, and you map the source key figures from the InfoCube to the target key figures in the planning area.
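If you run this load in the background, a minimal sketch is simply to submit the standard program with your own variant. The report name Z_RUN_TSINPUT_LOAD and the variant name Z_DP_DAILY below are only examples; the program itself is the standard /SAPAPO/RTSINPUT_CUBE behind /SAPAPO/TSCUBE.

* Sketch only: trigger the InfoCube -> planning area load with an existing
* variant of the standard load program. Replace Z_DP_DAILY with the variant
* you maintained via /SAPAPO/TSCUBE.
REPORT z_run_tsinput_load.

SUBMIT /sapapo/rtsinput_cube
       USING SELECTION-SET 'Z_DP_DAILY'
       AND RETURN.

WRITE: / 'TSINPUT load triggered with variant Z_DP_DAILY.'.

In practice you can also just schedule /SAPAPO/RTSINPUT_CUBE with the variant directly as a background job step; a wrapper report like this is only useful if you want to chain several loads together.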

If you have any inconsistency when loading from the InfoCube to the planning area, then delete the CVCs, create them again, and load again. Check the PSA for existing data being loaded again.

If you are loading from the planning area to the InfoCube, then check the update rules in RSA1 for overwrite, summation, etc. Also check in the PSA whether existing data is additionally getting loaded into the InfoCube.

Kindly explain more about what exactly you are doing; then I can suggest more on this.

thanks and regards

Murugesan

AA3
Participant

We get a file from an external system (FILE DataSource) every day, so the data that reaches the InfoCube daily is fine when compared to the file. But when we use the TSINPUT transaction to load from the InfoCube to the planning area, the previous day's values of all the key figures do not get cleared in the planning book and therefore add up to show a different value compared to the InfoCube.

We have weekly buckets in the planning book. The CVCs do not show any inconsistency.

In the cube: APO Product 12345, APO Location LOC1, Qty 100

In the planning book: Qty shows 200 or in some cases 400 (it adds up the previous data)

When we run the TSINPUT transaction, the output of the program shows certain warnings.

AA3
Participant

I am still not able to understand why you mention update rules and PSA. Update rules (transformations) are supposed to be from the source to the InfoCube.

Our problem is from InfoCube --> planning area --> planning book.

(Where are the update rules defined, apart from the checkboxes in the TSINPUT program variant for source and target?)

Former Member

Hi,

Please check your TSCUBE variant. In the TSCUBE variant, please click the 'Additional Settings' button. If the 'Ignore Initial Value' checkbox is selected, then this is the reason for the issue.

Please uncheck it and run TSCUBE again.

Example:

In a backup InfoCube the key figure for the forecast has no value for Product A, Location 001 for April 2005. However, the planning area does have a value. If you copy data from the InfoCube to the planning area, the corresponding target key figure value will be set to initial, which effectively deletes the existing value. If you select the checkbox, the value will not be set to initial.

Thanks,

Jeysraj

AA3
Participant

It is not checked in our variant

Former Member

Hi,

I don't know why you are looking at PSA and DataSource settings. That BW loading process is not relevant to your issue, since you are loading the data from the cube to the planning area.

Check the following things:

1) At what level are you releasing to the planning area (is it a detailed-level value or aggregated), and are you looking at the same level in the planning book?

2) Check the unit of measure used in the cube and the display unit of measure in the planning book.

3) Create the CVCs again and try.

Regards

Thennarasu.M

AA3
Participant

Got a reply from SAP with a workaround. They say that if you have a large number of key figures in the TSINPUT variant, with a daily load strategy and a lot of data in the cube, the variant does not serve its purpose and throws up this error of not clearing the previous values in the bucket for the new ones.

So this was resolved by creating multiple variants for the TSINPUT program with a few key figures in each variant, instead of more than 40 key figures in a single variant.
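A minimal sketch of how the split loads can be chained in one background step is below. The report name Z_RUN_TSINPUT_SPLIT and the variant names ZDP_KF_GRP1 to ZDP_KF_GRP3 are only examples for the smaller key figure groups; the program is the same standard load program mentioned earlier in the thread.

* Sketch only: run the standard load program once per smaller key-figure
* variant instead of one variant with 40+ key figures.
REPORT z_run_tsinput_split.

TYPES ty_variant TYPE c LENGTH 14.

DATA: lt_variants TYPE STANDARD TABLE OF ty_variant,
      lv_variant  TYPE ty_variant.

APPEND 'ZDP_KF_GRP1' TO lt_variants.
APPEND 'ZDP_KF_GRP2' TO lt_variants.
APPEND 'ZDP_KF_GRP3' TO lt_variants.

LOOP AT lt_variants INTO lv_variant.
  " Each variant loads only a subset of the key figures from the cube
  SUBMIT /sapapo/rtsinput_cube
         USING SELECTION-SET lv_variant
         AND RETURN.
ENDLOOP.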

SAP says a Note may be released soon for this issue.

Former Member

Hi Ashfaq Ali,

Did you manage to find an answer for this? I came across your thread while browsing to find an answer to a problem similar to mine. The problem I am facing is that the job from a process chain meant to delete previous data from a planning area failed. I'm not sure why, but if clearing the previous month's data failed, it might double up the quantity when I execute the job to load from the InfoCube to the planning area for this month's cycle. So I was trying to figure out how to manually delete data from the planning area so that the data loading can be done accurately. Kindly suggest possible ways. Thank you!

regards,

arryazua

AA3
Participant

Please check your process chain variant for the step where it failed. You will find the program which is being used for the deletion of data; this will be the best way for you to find out what is being used in the existing process chain.

You can also check the transaction code /n/sapapo/rlcdel.
