Handle Deleted Deliveries

Former Member

Hi,

DSO 1 is fed by DSO 2 and DSO 3

DSO 1 Key = Order ,Order Item

DSO 2 Key = Delivery , Delivery Item

DSO 3 Key = Order, order item , Schedule line

Suppose DSO 1 has 50 fields: 30 populated by DSO 2 and 20 by DSO 3.

Now consider this scenario: if a delivery is deleted, the record mode will be 'R', and this will delete the entire line in DSO 1, including the fields populated by DSO 3. If a new delivery is then created with reference to this (deleted) order item, only the new data coming from DSO 2 will be added as a new line to DSO 1. All the fields fed by DSO 3 will be empty in that case, as no delta will come from DSO 3.

This could create serious problems with data consistency.
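As a rough illustration of why this happens, here is a small Python model of the flow (purely illustrative; the field names and values follow the scenario above, and the DSO is reduced to a dictionary keyed on Order/Order Item):

```python
# Minimal model of an overwrite DSO keyed on (order, order_item).
# A record mode of 'R' (delete image) removes the whole row,
# including the fields that DSO 3 had populated.

dso1 = {}  # key: (order, order_item) -> record dict

def update_dso1(key, fields, recordmode=''):
    if recordmode == 'R':
        dso1.pop(key, None)          # delete image wipes the entire line
    else:
        rec = dso1.setdefault(key, {})
        rec.update(fields)           # overwrite only the fields delivered

# Initial state: both DSO 2 and DSO 3 have fed order O1, item 10
update_dso1(('O1', '10'), {'delivery': 'D1', 'delivery_qty': 10})  # from DSO 2
update_dso1(('O1', '10'), {'order_qty': 10})                       # from DSO 3

# Delivery D1 is deleted: the 'R' image deletes the whole row
update_dso1(('O1', '10'), {}, recordmode='R')

# New delivery D2 is created; only DSO 2 sends a delta
update_dso1(('O1', '10'), {'delivery': 'D2', 'delivery_qty': 10})

print(dso1[('O1', '10')])  # the order_qty fed by DSO 3 is gone
```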

How can this be handled?

Thanks

Nayab

Accepted Solutions (1)


Former Member

Hi,

Your DSO 1 key fields should be a combination of the DSO 2 and DSO 3 key fields.

Could you post an example with data?

Former Member

Hi Jyoti

DSO keys are as mentioned in my original post.

Here is one example. Suppose the delivery below is deleted; this will delete the corresponding line from DSO 1.

Now suppose a new delivery D2 is created for order O1, item 10. This will flow to DSO 1, and the record will look somewhat like the one below, because no new delta will flow from DSO 3, which populates the order-related fields in DSO 1. This is the problem.

Current data (DSO 1):

| Delivery | Item | Order | Delivery Quantity | Order Qty |
|----------|------|-------|-------------------|-----------|
| D1       | 10   | O1    | 10                | 10        |

After the new delivery is created, post deletion of D1 (DSO 1):

| Delivery | Item | Order | Delivery Quantity | Order Qty |
|----------|------|-------|-------------------|-----------|
| D2       | 10   |       |                   |           |
Former Member

Hi,

This is a known problem in ECC-BI. Deliveries can only be deleted from the ECC system, not cancelled, so the extractor is not able to provide a delete image for the record.

There are two possible solutions:

1. A full load of all BI targets every day. This is practical only when your data volume is small and the full load does not take much time. If your data volume is huge and the flow is a complex one, this is not the solution.

2. Identify which delivery document/item combinations were deleted from the ECC system and set a flag against those documents.

You can easily find which document/item was deleted from the CDPOS and CDHDR entries. In our project we created a DataSource based on an InfoSet query (SQ02) on CDPOS (item level) and CDHDR (header level), pulling all the documents that were deleted in the ECC system. For all those delivery/item combinations we set a deletion flag and restrict the query accordingly.

In CDPOS, the following fields are important:

a) CHNGIND: Change type (U, I, S, D) – shows whether the entry was deleted, updated or inserted. We are interested here in the value D (deleted).

b) OBJECTCLAS: Object class – used to get the data related to deliveries, sales, etc. For deliveries the value is LIEFERUNG; for sales it is VERKBELEG.

c) TABNAME: Table name – contains the table from which the value was deleted. The tables for deliveries are LIKP and LIPS; to get item-level data, use LIPS.

d) TABKEY: An 18-character field in which the first 3 characters are the client, the next 10 the document number and the last 5 the item number. E.g. 750001099999900010, where 750 is the client, 0010999999 the document number and 00010 the item number.
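The selection and TABKEY split described above can be sketched as follows (a Python illustration of the logic only; in a real system this sits in the InfoSet query/ABAP layer, and the row structure shown is a made-up stand-in for CDPOS entries):

```python
def parse_tabkey(tabkey: str):
    """Split an 18-character LIPS TABKEY into client, document, item."""
    if len(tabkey) != 18:
        raise ValueError("expected an 18-character TABKEY for LIPS")
    client = tabkey[0:3]       # first 3 characters: client
    document = tabkey[3:13]    # next 10: delivery document number
    item = tabkey[13:18]       # last 5: item number
    return client, document, item

def deleted_delivery_items(cdpos_rows):
    """Yield (document, item) for deleted delivery items in CDPOS."""
    for row in cdpos_rows:
        if (row['CHNGIND'] == 'D'              # deleted entries only
                and row['OBJECTCLAS'] == 'LIEFERUNG'
                and row['TABNAME'] == 'LIPS'):  # item-level table
            _, doc, item = parse_tabkey(row['TABKEY'])
            yield doc, item

# Example from the post: client 750, document 0010999999, item 00010
rows = [{'CHNGIND': 'D', 'OBJECTCLAS': 'LIEFERUNG',
         'TABNAME': 'LIPS', 'TABKEY': '750001099999900010'}]
print(list(deleted_delivery_items(rows)))  # [('0010999999', '00010')]
```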

Also have a look at the how-to guide below:

Link [CDPOS-CDHDR|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/50dd3fb9-0dad-2d10-0ea7-cbb0bb8b3...]

how to view deleted delivery

Former Member

Hi Regina,

Thanks for your response.

However, contrary to what you mentioned, our LIS 12 extractor is perfectly delivering delete images ('R') when a delivery is deleted in ECC.

So I think the above solution is not relevant to my problem, as there is no issue whatsoever with the images/record mode when a delivery is deleted.

Thanks

Former Member

I was just trying to help you out, that's all. I haven't faced this sort of issue.

former_member185132
Active Contributor

Hi Nayab,

Indeed, you are getting the correct RECORDMODE of 'R' which then causes the deletion of the record in DSO2 and DSO1 and that is exactly what it is designed to do. Now even if a new delivery is created in ECC and sent over to DSO2 and then on to DSO1, the BW system doesn't know that it is supposed to also load the corresponding DSO3 data into DSO1 at that point.

Your expectation is that in future, if a new Delivery gets created in ECC, then it should come via DSO2 -> DSO1 and that it should also cause the DSO3 data to get populated in DSO1. That is not standard behaviour of BW, so to make it happen either the design or the logic has to be changed.

So, to help you with this, it is important to know the current design and logic:

  1. Could you describe the data flow (up to the cube/reporting layer)?
  2. Why are the key fields of DSO1 different from DSO2/DSO3? This suggests there might be some logic in the DSO2->DSO1 and DSO3->DSO1 transformations, so please describe what that logic is.

Regards,

Suhas

Former Member

Hi Suhas, 

Thanks for your response.

Here is the flow.

2LIS_12_VCITM --> DSO 2 (Delivery Item DSO) --> DSO 1 --> Cube 1

2LIS_11_VASCL --> DSO 3 (Order Schedule Line DSO) --> DSO 1 --> Cube 1

If you look at DSO 1, Order and Order Item are the keys. This is because we feed Cube 1 from this DSO and need reporting based on the order/order item combination at an aggregated level. Besides this, we also need to report delivery KPIs that are registered against a particular order.

Therefore we load the delivery item and order schedule line data into two separate DSOs (DSO 2 and DSO 3) and combine them in DSO 1 with Order and Order Item as the key; that is why the keys are different.

The logic between DSO2->DSO1 and DSO3->DSO1 is simple business logic to filter certain documents based on characteristic values such as document type. There are simple DELETE statements for this.


Apparently, when a new delivery is created there is no change to the order/item, so no delta load happens from DSO 3, and that is what causes the problem.

Thanks

former_member185132
Active Contributor

It looks like this requirement could be met by adding Delivery, Delivery Item and Schedule Line to the DSO 1 key. Reporting can still be at order level, as the aggregation would happen in the cube/query anyway.

In fact, your current design probably already has data loss and consistency problems apart from the one you identified. For instance, when a single order item has multiple deliveries or schedule lines (perfectly possible scenarios), the current design loses all but one of those deliveries/SCLs. I doubt that's what you want. So even from this perspective, adding those extra fields to DSO 1 would help.

Former Member

Hi Suhas

I am wondering how adding extra key fields would overcome the problem I am currently facing.

Even then, if the source delivers a record with image 'R', it would delete the entire row in DSO 1.

Changing the design is the last option and virtually impossible, because we have massive business data in the production box. By the way, multiple deliveries/SCLs are consistent at the moment, because the delivery KPIs are additive from DSO2->DSO1. Suppose I have 5 deliveries for order O1/item 10; during the data load these 5 deliveries will be aggregated against O1/item 10 and eventually shown in a single line in DSO 1.

Thanks

former_member185132
Active Contributor

Once you have Delivery and SCL as part of the key of DSO1, what would happen is that DSO1 will contain separate records for the data that came in from DSO3 and DSO2. So if an order has a Delivery and a SCL, then there would be two records for this order in DSO1 - one record having Delivery data (from DSO2) and one having SCL data (from DSO3).

Next, when a delivery gets deleted, this results in deletion from DSO2 and also a deletion in DSO1 - however, in DSO1 only the record belonging to the delivery gets deleted. The SCL record (which came in from DSO3) remains as-is in DSO1. So tomorrow when the delivery gets re-created, it lands up in DSO1 again and the DSO3->DSO1 load doesn't need to be repeated. This is logically consistent because the deletion of a delivery should not logically cause SCL data in DSO1 to get wiped out. Also the summarization continues to happen when the data moves to the cube or is reported on.
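As a rough sketch of this behaviour (illustrative Python only; the composite key and field names are made up following the discussion above):

```python
# DSO 1 keyed on (order, order_item, delivery, schedule_line):
# the DSO 2 and DSO 3 records now live side by side.
dso1 = {}

def load(key, fields, recordmode=''):
    if recordmode == 'R':
        dso1.pop(key, None)   # only the matching record is deleted
    else:
        dso1.setdefault(key, {}).update(fields)

load(('O1', '10', 'D1', ''), {'delivery_qty': 10})   # from DSO 2
load(('O1', '10', '', 'S1'), {'order_qty': 10})      # from DSO 3

# Delivery D1 deleted: only the delivery record disappears
load(('O1', '10', 'D1', ''), {}, recordmode='R')

# Delivery re-created as D2: no DSO 3 reload is needed
load(('O1', '10', 'D2', ''), {'delivery_qty': 10})

# Order-level aggregation still works in the cube/query
total_delivery_qty = sum(r.get('delivery_qty', 0) for r in dso1.values())
total_order_qty = sum(r.get('order_qty', 0) for r in dso1.values())
print(total_delivery_qty, total_order_qty)  # 10 10
```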

If you do not want to change the DSO1 design, there are alternatives:

  1. Load the cube directly from DSO 2 and DSO 3. This results in behaviour similar to what is described above.
  2. In the DSO2 -> DSO1 transformation, write a lookup end routine that picks up data from DSO 3 and populates those fields in the target. This is very troublesome given the differences in granularity between the DSOs, and the additive logic will have to be handled in the code; you cannot rely on the "additive" behaviour of the transformation here. The complexity doubles if SCLs can also be deleted, because that would necessitate similar logic on the DSO3 -> DSO1 side.

You'll notice that the first option is also a design change, and the second involves a lot of logic. This is unavoidable, because what you're looking for is beyond standard behaviour and, as I said earlier, the only way to achieve it is by changing the design or the logic.

Former Member

Hi Suhas,

Thanks for your wonderful suggestion. The options you provided are certainly very logical.

I am pleased to inform you that I have found a rather simple solution instead. What I will do is convert all records with record mode 'R' to 'X' in the start routine between DSO2-->DSO1. This will selectively reset all the KPIs in DSO 1 without disturbing the other fields fed from DSO 3.

I arrived at this after finding an excellent SAP Note on this topic that explains exactly this problem in detail. Marking this thread as closed now.

Note No: 1358780 - Limitations of DataStore loading scenarios (Solution C)
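For illustration, the effect of this fix can be modelled as below (Python is used only to model the logic; the actual implementation is an ABAP start routine in the DSO2-->DSO1 transformation, and the assumption here is that the delivery KPIs are updated additively and the 'R' image carries the deleted delivery's quantities):

```python
# Illustrative model only: 'R' would delete the whole record in the DSO,
# while a before image 'X' just reverses the additive key figures,
# leaving the DSO 3-fed fields untouched.

def start_routine(source_package):
    """Convert delete images to before images."""
    for rec in source_package:
        if rec['RECORDMODE'] == 'R':
            rec['RECORDMODE'] = 'X'
    return source_package

def activate(dso, rec):
    """Toy activation: 'X' subtracts the additive KPI; '' adds it."""
    key = (rec['ORDER'], rec['ITEM'])
    target = dso.setdefault(key, {'DELIV_QTY': 0, 'ORDER_QTY': 0})
    if rec['RECORDMODE'] == 'X':
        target['DELIV_QTY'] -= rec['DELIV_QTY']   # reset the delivery KPI
    else:
        target['DELIV_QTY'] += rec['DELIV_QTY']
    # ORDER_QTY (fed by DSO 3) is never touched by this load

dso1 = {('O1', '10'): {'DELIV_QTY': 10, 'ORDER_QTY': 10}}

# Delivery deleted: extractor sends 'R'; the start routine turns it into 'X'
pkg = start_routine([{'RECORDMODE': 'R', 'ORDER': 'O1', 'ITEM': '10',
                      'DELIV_QTY': 10}])
for rec in pkg:
    activate(dso1, rec)

print(dso1[('O1', '10')])  # {'DELIV_QTY': 0, 'ORDER_QTY': 10}
```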


Thanks

Answers (0)