Self Loop in SAP BI

former_member184624
Participant

Hi All,

Can you please provide some scenarios for self-loops on DSOs and InfoCubes? In which real-time situations would we use this concept?

Thanks.

Accepted Solutions (1)


former_member209895
Contributor

Hi Jalina,

    One good scenario I can think of is a change request on an existing report. Suppose a new InfoObject (a characteristic) is required in the report, and this InfoObject does not exist in the InfoCube on which the report is based. Through remodelling we can insert the new InfoObject, and with a routine in the transformation we can make sure that every data load that runs from then on populates values for the new InfoObject via a lookup or some other logic. If you then also want the new InfoObject filled for all the records that were already loaded into the InfoCube (the historical data), you can create a self-loop on the InfoCube, incorporate the same population routine in that transformation, and trigger the load to fill the InfoObject value for the historical data.
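For illustration, here is a minimal end-routine sketch of such a population routine. All names are hypothetical and would have to be adapted: ZNEWCHAR stands for the new characteristic, /BIC/AZLKPDSO00 for the active table of whatever DSO you look the value up from, and DOC_NUMBER for the lookup key.

* End routine sketch (hypothetical names): populate the new
* characteristic ZNEWCHAR via a lookup on a DSO active table.
TYPES: BEGIN OF ty_lkp,
         doc_number    TYPE /bi0/oidoc_number,
         /bic/znewchar TYPE /bic/oiznewchar,
       END OF ty_lkp.

DATA: lt_lkp TYPE STANDARD TABLE OF ty_lkp,
      ls_lkp TYPE ty_lkp.

FIELD-SYMBOLS: <result> TYPE _ty_s_tg_1.

* One database access per data package instead of one per record
IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT doc_number /bic/znewchar
    FROM /bic/azlkpdso00
    INTO TABLE lt_lkp
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE doc_number = RESULT_PACKAGE-doc_number.
  SORT lt_lkp BY doc_number.
ENDIF.

LOOP AT RESULT_PACKAGE ASSIGNING <result>.
  READ TABLE lt_lkp INTO ls_lkp
       WITH KEY doc_number = <result>-doc_number
       BINARY SEARCH.
  IF sy-subrc = 0.
    <result>-/bic/znewchar = ls_lkp-/bic/znewchar.
  ENDIF.
ENDLOOP.

Since the self-loop reuses the same routine body as the regular transformation, the new loads and the historical backfill stay consistent.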

Hope this helps,

Regards,

  Manohar. D

Former Member

Hello Manohar,

A cube, IC_C03, with logistics data and more than 20 million records needs to be updated with sales order material data (the field SO_MAT already exists in the cube); the values need to be read from a DSO that shares the common fields sales order and item.

So this is a case of updating historical data in the cube by reading data from the DSO.

Can you tell me how this can be done? I tried using a self-transformation, but since it is a cube, the load creates new records instead of overwriting.

Regards,

Anil V. 

former_member209895
Contributor

Hi Anil,

    I've worked on such self-loops on DSOs. When it comes to an InfoCube, I believe we need to create two internal tables in the start routine / end routine, say with the same structure as SOURCE_PACKAGE. To one internal table we append the original record as it is, but with all its key figures multiplied by -1 (a sign change). To the other internal table we populate the original record with the looked-up value for the new key figure / characteristic. Finally we rebuild the source package from the two internal tables, or simply append the records from both internal tables to the InfoCube data. The InfoCube then holds

a. the original record,

b. a reversal record, identical to the original but with all key figures negated, and

c. one more record, which is the original record with the new InfoObject holding the historical value.

The original record thereby gets cancelled out. Give these ideas to an ABAPer and he/she will be able to jot down performance-friendly code in no time; a rough sketch follows below.
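As a rough sketch of that idea in a self-loop end routine, with every name hypothetical: QUANTITY and AMOUNT stand in for the cube's actual key figures, /BIC/AZSODSO00 for the active table of the DSO holding SO_MAT, and DOC_NUMBER / S_ORD_ITEM for sales order and item.

* Self-loop end routine on the cube (all names hypothetical):
* for every record, emit (a) a reversal with all key figures
* negated and (b) a copy enriched with the looked-up SO_MAT.
TYPES: BEGIN OF ty_so_mat,
         doc_number  TYPE /bi0/oidoc_number,
         s_ord_item  TYPE /bi0/ois_ord_item,
         /bic/so_mat TYPE /bi0/oimaterial,
       END OF ty_so_mat.

DATA: lt_so_mat TYPE STANDARD TABLE OF ty_so_mat,
      ls_so_mat TYPE ty_so_mat,
      lt_result TYPE _ty_t_tg_1,
      ls_result TYPE _ty_s_tg_1.

FIELD-SYMBOLS: <orig> TYPE _ty_s_tg_1.

* Buffer the DSO lookup once per data package
IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT doc_number s_ord_item /bic/so_mat
    FROM /bic/azsodso00
    INTO TABLE lt_so_mat
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE doc_number = RESULT_PACKAGE-doc_number
      AND s_ord_item = RESULT_PACKAGE-s_ord_item.
  SORT lt_so_mat BY doc_number s_ord_item.
ENDIF.

LOOP AT RESULT_PACKAGE ASSIGNING <orig>.
* (a) reversal record: every key figure multiplied by -1
  ls_result = <orig>.
  ls_result-quantity = ls_result-quantity * -1.
  ls_result-amount   = ls_result-amount   * -1.
  APPEND ls_result TO lt_result.

* (b) enriched record: original values plus looked-up SO_MAT
  ls_result = <orig>.
  READ TABLE lt_so_mat INTO ls_so_mat
       WITH KEY doc_number = <orig>-doc_number
                s_ord_item = <orig>-s_ord_item
       BINARY SEARCH.
  IF sy-subrc = 0.
    ls_result-/bic/so_mat = ls_so_mat-/bic/so_mat.
  ENDIF.
  APPEND ls_result TO lt_result.
ENDLOOP.

RESULT_PACKAGE = lt_result.

Together with the original record already stored in the cube, (a) cancels out the old key-figure values while (b) carries them under the filled SO_MAT, which is exactly the three-record situation described above.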

Again, I'm just providing an idea here; I have not implemented such logic on InfoCubes, but I'm fairly sure this will work.

Hope this helps. Do let us know how you got it to work.

Thanks,

Manohar. D

Former Member

Thank you. I have 20 million records to be updated; after the update there will be 40 million, and the read performance will be very low.

former_member209895
Contributor

Hi Anil,

     After updating there will actually be 60 million records, not 40 million (20 million original records + 20 million reversal records + 20 million records carrying the historical value for the new InfoObject). I'm quite sure you are running compression on the InfoCube; in that case, look at the "WITH ZERO ELIMINATION" option while compressing. What it does is eliminate records whose key figures are all zero after compression. I would suggest you try it on 5K records first, then compress, and then check how many records the E table holds. If you find 5K, we've hit the bull's eye. 🙂
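To illustrate with made-up numbers for a single sales order item, after the self-loop load the cube's F table would hold:

Request   SO_MAT     Quantity   Amount
R1        (blank)         10       100    <- original load
R2        (blank)        -10      -100    <- reversal record
R2        MAT_001         10       100    <- enriched record

Once compression drops the request dimension, the first two rows share the same characteristic combination, so they aggregate to all-zero key figures and "WITH ZERO ELIMINATION" removes them. The E table keeps only:

SO_MAT     Quantity   Amount
MAT_001         10       100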

I'm sure you'll find 5K records.

Manohar. D

former_member209895
Contributor

Hi Anil,

     I guess zero elimination will not work for you, buddy. The InfoCube you are using is a non-cumulative cube, i.e. it holds non-cumulative key figures, which means the values never get summed up. Please try compression without zero elimination, test it in DEV first with a sample amount of data, and hopefully it works. I believe some experts looking at this thread might come up with a good plan.

Could you use a MultiProvider to get the SO_MAT value from the DSO, based on sales order and item as the common fields? Please explore this option as well.

Manohar. D

Former Member

The InfoCube is a custom cube, and it is cumulative.

former_member209895
Contributor

If you are sure that it's a custom cube with no non-cumulative key figures, then I'm confident that zero elimination upon compression will work in this scenario. Please do not forget to update this thread with the results once you have tried the suggestions on sample data.

Answers (1)


Former Member

Hi Jalina,

A self-loop is required in scenarios where you want to read data from the target, look it up in another provider, and load it back into the target.

For example, in one case I had a DSO1 and a DSO2, where DSO2 delivered the delta that needed to be loaded into DSO1, but all the fields had to come from DSO1 itself. So I created a transformation to bring the delta request into the source package, and in an expert routine I looked up DSO1 itself and populated all the remaining fields, except the one field that comes from DSO2. So the delta is taken from DSO2, but all the other data comes from DSO1.
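A minimal expert-routine sketch of that pattern, with hypothetical names throughout: DOC_NUMBER as the shared key, /BIC/AZDSO100 as DSO1's active table, and /BIC/ZDELTAFLD as the one field that really comes from DSO2.

* Expert routine sketch (hypothetical names): the delta request
* comes from DSO2 with the key plus one field, /BIC/ZDELTAFLD;
* every other field is read back from DSO1's own active table.
DATA: ls_result TYPE _ty_s_tg_1,
      ls_dso1   TYPE /bic/azdso100.

FIELD-SYMBOLS: <src> TYPE _ty_s_sc_1.

LOOP AT SOURCE_PACKAGE ASSIGNING <src>.
  CLEAR: ls_result, ls_dso1.

* Look up the existing record in DSO1's active table
  SELECT SINGLE * FROM /bic/azdso100 INTO ls_dso1
    WHERE doc_number = <src>-doc_number.
  IF sy-subrc = 0.
    MOVE-CORRESPONDING ls_dso1 TO ls_result.
  ENDIF.

* The key and the one genuinely new field come from the DSO2 delta
  ls_result-doc_number     = <src>-doc_number.
  ls_result-/bic/zdeltafld = <src>-/bic/zdeltafld.

  APPEND ls_result TO RESULT_PACKAGE.
ENDLOOP.

The SELECT SINGLE inside the loop keeps the sketch short; for real delta volumes you would buffer DSO1's table with one FOR ALL ENTRIES select per package, as in the earlier sketches.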

Regards,

Ganesh