Semantic Grouping and DTP Packet Size


Hello All,

I have a DTP that loads data from DSO to DSO based on valid customers in a particular week.
If there are, say, 100 unique customers, I want the number of data packages to be 100.

I used semantic grouping on this customer field and set my data package size to 2,000, but two unique customers can still end up in a single data package, e.g. when Customer 1 = C1 = 500 records and Customer 2 = C2 = 1,500 records.

Please let me know which settings ensure that a data package contains only a single customer's records (as the above case shows, the semantic key alone may not always achieve this).

I see that the data package size set in the DTP sometimes takes precedence over the semantic key.

Please suggest any option/settings I can use for this.

Thanks
Sonal

Accepted Solutions (1)

bhaskar_v3
Explorer

Hi Sonal,

Each customer can be extracted into its own data package in the following ways:

1. Add Customer as the semantic key in the DTP and set the data package size to 1. Each customer will then be extracted into a single data package: if one customer has 10 records and the package size is 1, all 10 records are still loaded into one data package, because keeping semantic-key records together takes precedence over the package size.

2. Use a planning sequence and write a FOX formula that performs the required logic for each customer (FOREACH Customer).

3. Change the ABAP code (based on your requirement) so that the loop condition considers all records of each customer together.

With the second and third methods, the changes need to be transported through all three systems in the landscape, so the first method is the best option for a fast solution.
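The grouping behavior behind method 1 can be illustrated with a small simulation. This is not SAP code; `build_packages`, the tuple record layout, and the sizes are illustrative assumptions. The key property is that records sharing the semantic key are never split across packages, so a package size of 1 forces one customer per package.

```python
from itertools import groupby

def build_packages(records, key, max_size):
    """Simulate DTP semantic grouping: records sharing the same
    semantic key always land in the same data package, even when
    that makes the package exceed max_size."""
    records = sorted(records, key=key)
    packages = []
    current = []
    for _, grp in groupby(records, key=key):
        grp = list(grp)
        # flush the current package only if adding this whole
        # semantic group would push it past the size limit
        if current and len(current) + len(grp) > max_size:
            packages.append(current)
            current = []
        current.extend(grp)
    if current:
        packages.append(current)
    return packages
```

With `max_size=2000`, C1 (500 records) and C2 (1,500 records) fit into one package, which is exactly the situation described in the question; with `max_size=1`, every semantic group on its own already exceeds the limit, so each customer gets a separate package.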

Kindly check the above solution and let us know if you need further clarification.

Regards,

Bhaskar V

Answers (2)

former_member185132
Active Contributor

If you set semantic grouping on Customer ID and set the packet size to 1 record, then the system will give you only one customer ID per data packet.

I am curious though, as to why you would need to do this. What business requirement do you have that necessitates such a solution?


Hi Suhas,


Yes it works with packet size 1. Thank you


We have logic based on customer. For a valid customer, if we have received data for any combination of materials, irrespective of validity, we need to perform the calculation for all combinations of that customer.

If a customer is in a single data package, the logic works completely. If the customer is split into two or more packages, some of the combinations will not get updated in the target.

For example, the first data package may contain 10 material combinations for customer C1, none of which has received data.
In data package 2 we get the same customer C1 with 5 other materials, one of which has data; the logic then updates only those 5 customer*material combinations, and the combinations present in data package 1 are missed.
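A minimal sketch of why the split matters. The transformation logic here is a hypothetical stand-in for the routine described above (the record layout and `process_package` are assumptions), processing each data package independently: if any material of a customer in a package has data, all of that customer's combinations *in that package* are updated.

```python
def process_package(package):
    """Hypothetical per-package transformation: if any material of a
    customer in this package has data, update ALL of that customer's
    customer*material combinations seen in this package."""
    has_data = {r["customer"] for r in package if r["qty"] > 0}
    return [(r["customer"], r["material"])
            for r in package if r["customer"] in has_data]

# C1 split across two packages: package 1 has 10 materials with no
# data, package 2 has 5 materials, one of which has data.
pkg1 = [{"customer": "C1", "material": f"M{i}", "qty": 0} for i in range(10)]
pkg2 = [{"customer": "C1", "material": f"M{i}", "qty": 0} for i in range(10, 15)]
pkg2[0]["qty"] = 5

split_result = process_package(pkg1) + process_package(pkg2)   # only 5 updated
merged_result = process_package(pkg1 + pkg2)                   # all 15 updated
```

When C1 is split, the 10 combinations in package 1 are lost; when semantic grouping keeps C1 in one package, all 15 combinations are updated.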

We have the option of fixing this in code, but since the calculations are based on calday, I was looking for a settings-based solution directly in P until we get the new code changes transported to P.

Thanks
Sonal

former_member182346
Active Contributor

Hi,

What is the requirement that makes the data package size depend on unique customers?

I think it is only possible by reducing the data package size, but then a customer will spill into additional packages whenever its records exceed the package size.

Please explain the requirement so that other ways to address it can be considered.

Thank-You.

Regards,

VB