
Mapping - Referencing another segment

Former Member

Hello,

I have an incoming XML document that I am mapping into an IDOC.

<ASN>

...<Item> 0...999

........<ItemID>

........<PO Reference>

.............<PONum>

.............<POLine>

...<barcode> 0...999

.........<handling Unit data>

.........<delivery item>

.............<itemID>

.............<quantity>

The <item> and <barcode> segments are on the same level. And they can each occur 0..999. So I can have 2 item segments and 10 barcode segments for those two PO/Lines. Or I can have 10 PO/Lines with only 3 barcodes...etc.

Using the matching <itemID> fields I'd like my mapping to fill the barcode segment of the IDOC with all the <barcode> data but also reference the matching <PONum> and <POLine> from the <item> segment.

How can I write a "lookup" that will find the matching <itemID> so we can match the data for the IDOC segment?

Thanks,

Matt

Edited by: Matthew Herbert on Aug 4, 2011 9:49 PM

Accepted Solutions (1)


baskar_gopalakrishnan2
Active Contributor

You can implement this with simple standard functions.

If you have multiple occurrences of the item and barcode nodes:

First use removeContexts, then an ifWithoutElse whose condition checks the itemID, and another ifWithoutElse condition for PONum and POLine; then map the barcode XML data to the barcode IDOC segment.
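In plain Java terms, the chain behaves roughly like this for each queue position (a standalone illustration only, assuming the two context-removed itemID queues line up one-to-one; the values are made up):

// Illustration only: the condition compares the two itemID queues position by position (equalS),
// and ifWithoutElse passes PONum through on a match and suppresses the value otherwise.
public class EqualsIfWithoutElseSketch {
    public static void main(String[] args) {
        String[] barcodeItemId = {"1", "2", "3"};
        String[] itemItemId    = {"1", "2", "3"};
        String[] poNum         = {"450000123", "450000555", "450000123"};

        for (int i = 0; i < barcodeItemId.length; i++) {
            if (barcodeItemId[i].equals(itemItemId[i])) {    // equalS
                System.out.println(poNum[i]);                // ifWithoutElse "then" value
            }                                                // no else: value is suppressed
        }
    }
}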

Answers (3)


Former Member

I wrote a UDF to loop through all the "item" segments for each "barcode" segment. Performance does not appear to be an issue.
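For reference, a rough sketch of what that UDF looks like (queue-level, with removeContexts applied to all three inputs; the exact signature and names here are illustrative rather than the production code):

// Queue UDF sketch: for each barcode itemID, scan all item itemIDs and emit the matching PONum.
// Assumes removeContexts has been applied to barcodeItemId, itemItemId and poNum beforehand.
public void getPONumForBarcode(String[] barcodeItemId, String[] itemItemId,
        String[] poNum, ResultList result, Container container)
        throws StreamTransformationException {
    for (int b = 0; b < barcodeItemId.length; b++) {
        String value = "";    // default if no <Item> matches this barcode
        for (int i = 0; i < itemItemId.length; i++) {
            if (barcodeItemId[b].equals(itemItemId[i])) {
                value = poNum[i];    // PONum from the matching <Item>
                break;
            }
        }
        result.addValue(value);    // one EBELN value per barcode
    }
}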

Thanks,

Matt

Former Member

Thanks for the suggestions, but this solution only works if the <item> and <barcode> segments are in order. I cannot guarantee that the documents will be sent in a sorted order.

That means I could get a document like this:

<ASN>

...<Item> 0...999

........<ItemID>1

........<PO Reference>

.............<PONum>450000123

.............<POLine>1

...<Item> 0...999

........<ItemID>2

........<PO Reference>

.............<PONum>450000555

.............<POLine>1

...<Item> 0...999

........<ItemID>3

........<PO Reference>

.............<PONum>450000123

.............<POLine>2

...<barcode> 0...999

.........<handling Unit data>

.........<delivery item>

.............<itemID>1

.............<quantity>10

...<barcode> 0...999

.........<handling Unit data>

.........<delivery item>

.............<itemID>2

.............<quantity>50

.........<delivery item>

.............<itemID>3

.............<quantity>44

In the above example the EqualS/IF/Split function solution will fail.

What I really need is to be able to "look up" the value in PONum when the barcode <itemID> matches the line <itemID>.

Can this be done?

I can write a Java UDF that will loop through everything in the <line>-<itemID> to get the correct data, but if the document has 100 lines and 50 barcodes, I will have to loop through 100 <line> segments for each of the 50 barcodes. This could be a performance problem, and I'm hoping there is a better solution.
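One way I could keep that loop cheap (just a sketch; the UDF signature and names here are illustrative, not tested code) would be to index the lines in a HashMap once and then do a single lookup per barcode:

// Queue UDF sketch: build an itemID -> PONum map once, then do one lookup per barcode.
// Assumes removeContexts has been applied to all three inputs, so segment order no longer matters.
public void getPONumForBarcode(String[] barcodeItemId, String[] itemItemId,
        String[] poNum, ResultList result, Container container)
        throws StreamTransformationException {
    java.util.Map<String, String> poByItemId = new java.util.HashMap<String, String>();
    for (int i = 0; i < itemItemId.length; i++) {
        poByItemId.put(itemItemId[i], poNum[i]);    // index the <Item> data by itemID
    }
    for (int b = 0; b < barcodeItemId.length; b++) {
        String value = poByItemId.get(barcodeItemId[b]);
        result.addValue(value != null ? value : "");    // one EBELN value per barcode
    }
}

With 100 lines and 50 barcodes that would be roughly 150 steps instead of 5,000, and it would not depend on the segments being sorted.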

Thanks,

Matt

Former Member

So when I am trying to map PONum into EBELN you are suggesting:

ItemID(from barcode).....Remove Context....

......................................................................EqualS......................IfWithoutElse..........................EBELN

ItemID(from Item)..........Remove Context.....

........................................................................................................(then) PONum

The problem with the above mapping is that it will only have one context for EBELN. When we have one <Item> segment and 5 <barcode> segments, we end up with only one EBELN created on the IDOC (because there is only one context in the queue).

Any other suggestions?

Thanks,

Matt

former_member194786
Active Contributor

You can use the splitByValue function to handle this. Refer to the link below to read more about the standard graphical functions:

http://help.sap.com/saphelp_nwpi711/helpdata/en/21/3bb8c495125e4eb5969f0377885fe0/frameset.htm

Regards,

Sanjeev.