
Architecture question

Former Member

Hello,

I have designed a solution where we load batch master data and material movements (in that order) from R/3. Each batch contains several attributes (quantities) that must be available as key figures for reporting. This can be solved by reading the batch master data during the upload of the material movements into the InfoCube. In our solution there is always a 1:1 relationship between batches and material movements.
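To make the lookup concrete, here is a minimal sketch in plain Python (not ABAP) of the update-rule style master-data read described above. The table `batch_master` and the field names are illustrative assumptions, not actual BW objects.

```python
# Hypothetical in-memory batch master data: batch number -> attributes.
batch_master = {
    "B100": {"qty_attr": 50.0},
    "B200": {"qty_attr": 75.0},
}

def enrich_movement(movement, master):
    """Read the batch's quantity attribute while loading the movement."""
    enriched = dict(movement)
    # If the batch is missing from the master data, the key figure
    # defaults to 0.0 and is silently lost -- the weakness discussed below.
    enriched["qty_attr"] = master.get(movement["batch"], {}).get("qty_attr", 0.0)
    return enriched
```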

Unfortunately, the solution has one big weakness: what if someone enters a material movement and batch in R/3 after the batch master data has been loaded but before the material movements are loaded? The result would be missing key figures in the cube, and they would never be added.

Ideally, I would like a join between material movements and batches during the upload to the cube, where each material movement waits for its batch before being loaded. I can't see how that would be possible. I have briefly considered InfoSets, which I have no experience with; however, they do not seem flexible enough for our usage.
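The "wait for its batch" idea can be sketched as a staging loop: movements whose batch has not arrived yet are deferred and retried on the next load. This is a toy Python illustration under assumed names, not a BW feature.

```python
def staged_join(movements, batch_master, pending):
    """Join movements 1:1 with batch attributes; defer unmatched ones."""
    loaded, still_pending = [], []
    for m in list(pending) + list(movements):
        attrs = batch_master.get(m["batch"])
        if attrs is None:
            still_pending.append(m)        # batch not loaded yet: wait
        else:
            loaded.append({**m, **attrs})  # 1:1 join with its batch
    return loaded, still_pending
```

A movement deferred in one run is picked up automatically once its batch master data arrives in a later run.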

Does anyone have experience with similar issues?

Accepted Solutions (1)

Former Member

Christoffer,

You have identified just one of the problems associated with populating master data attribute values into an InfoProvider using update rules. There are many other problems with using this feature of BI.

It makes me wonder why SAP provided that feature in the first place. What is most saddening is that this is a very common practice in many BI installations.

The only "reliable" solution I have found so far is to populate these fields in R/3 in a user exit.

Former Member

Here is another example of how populating master data attribute values into an InfoProvider in update rules can bite you.

This master data (say X) has an attribute (say Y). X is loaded daily, before any transaction data, but it is a delta load, and a change in the value of Y does not trigger a delta. (Believe it or not, they had many InfoObjects like this at a place where I worked for a while; they performed full loads every week to catch up on all the misaligned attribute values.)

Now a new person is hired who doesn't know how X and Y work. She starts populating X's Y values into an InfoProvider, rendering the transaction data more or less worthless, depending on how important the value of Y is.
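The stale-attribute trap above can be simulated in a few lines of Python (all names invented): the delta load copies only records flagged as changed, an attribute-only change never sets the flag, so the BW copy of Y goes stale until a full load catches it up.

```python
def delta_load(bw_copy, source, changed_keys):
    """Copy only records flagged as changed (the delta)."""
    for key in changed_keys:
        bw_copy[key] = dict(source[key])

def full_load(bw_copy, source):
    """Replace the BW copy wholesale, realigning every attribute."""
    bw_copy.clear()
    for key, rec in source.items():
        bw_copy[key] = dict(rec)
```

After an attribute-only change in the source with an empty delta, the BW copy still carries the old value of Y; only the weekly full load repairs it.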

The only way (that I can think of) to prevent such problems is to gather all the additional data at the source in a user exit. Everything else is basically unreliable.

Former Member

Hello and thank you for your reply,

We have now come up with a quite simple solution that I believe will work. We extract the material movements and load them into an ODS. Later we extract the batches. Then we load the material movements from the ODS into the cube, joining them with the batch data. Since the batches are extracted after the movements, every movement's batch exists by the time the cube is loaded.
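The staging order above can be sketched as follows (plain Python with illustrative names): movements sit in an ODS list, batches are extracted afterwards, and the cube load joins each staged movement 1:1 with its batch.

```python
def load_cube_from_ods(ods_movements, batch_master):
    """Join each staged movement 1:1 with its batch's attributes."""
    # Because the batches were extracted after the movements, every
    # batch referenced in the ODS should already be present here.
    return [{**m, **batch_master[m["batch"]]} for m in ods_movements]
```

The key design choice is the extraction order: by staging movements first and extracting batches second, the race condition from the original post cannot occur.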

Regarding the other issue you describe, we have discussed that previously. Our conclusion is to ensure, using written procedures and dedicated transactions, that all corrections either won't be necessary in BW or will result in two additional pairs of material documents and batches: one reversing the original values and one carrying the new values.

Answers (0)