Jdbc

Former Member

Hi All,

I need help writing a UDF to append 20 arrays of values.

I want a UDF1 that appends 5 values, which I then pass to a second UDF that takes those 5 values and appends the next set of values.

So it is like a cascading effect.

Can you help?

Thanks

Accepted Solutions (1)

justin_santhanam
Active Contributor

Hi,

This can be done. If you give the structure and some sample data, it would be easy for us to suggest a solution.

Best regards,

raj.

Answers (7)

Former Member

It will probably do the job well. The only thing that concerns me is that when I run more than one queue UDF in my mappings, I run into issues with elements not staying in the right context. I'm going to try to do a little more mapping with this some time; it's been a while since I tried it, so no worries. Thanks, Raj.

Former Member

Hi Paul,

I am increasing your points since I feel you are following up on this problem.

Thanks

Former Member

Hey Raj or sap xis, I would like to know how the problem was solved. I'm just interested in the route taken.

justin_santhanam
Active Contributor

Paul,

Please find below the URLs I suggested. Let me know your concerns and thoughts regarding this.

http://flickr.com/photo_zoom.gne?id=1062275408&size=o

http://flickr.com/photo_zoom.gne?id=1062275356&size=o

http://flickr.com/photo_zoom.gne?id=1062275340&size=o

Best regards,

raj.

Former Member

Here's a more efficient algorithm:

String final="";

for(int i=1;i<=a.length;i++)

{

final+=a[ i ] ;

if(i%5==0)

{

final+=recorddelimiter;

}

else

{

final+=fielddelimiter;

}

count++;

}

result.addValue(final);
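A side note on the design: += on a String copies the whole buffer on every pass, so for long queues a StringBuffer (the XI mapping runtime predates StringBuilder) is likely cheaper. A minimal sketch, assuming the same a, fielddelimiter, recorddelimiter, and result variables as in the UDF above:

StringBuffer sb = new StringBuffer();

for (int i = 1; i <= a.length; i++)
{
    sb.append(a[i - 1]);   // arrays are zero-based, so shift the 1-based counter

    // after every 5th field close the record, otherwise separate fields
    sb.append(i % 5 == 0 ? recorddelimiter : fielddelimiter);
}

result.addValue(sb.toString());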

Former Member

Raj: Thanks for solving the problem.

Paul: Thanks for being proactively involved.

I'm giving both of you points.

Thanks again

Former Member

Hmmm, that's probably true... I hope Raj has something for you then.

It would seem to me, though, that running it through two UDFs would make it more congested from a speed standpoint, especially since you'll be using Advanced UDFs both times and pulling in the contexts.

Former Member

I still think the best way would be to do it with one UDF so you don't get the queuing issues.

SourceNode -> removeContexts -> UDFCombine -> DestinationNode

In the UDFCombine node, use this code:

String final="";

int grab=0;

while(grab<a.length)

{

for(int j=0;j<5;j++)

{

final+=a[grab];

if(j<4)

{

final+=fielddelimiter;

}

grab++;

}

final+=recorddelimiter;

}

result.addValue(final);
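If you want to verify the loop outside of XI first, here is a minimal standalone harness; the sample values and the CombineTest class name are made up for illustration, and in the real UDF the a, fielddelimiter, and recorddelimiter values arrive as mapping arguments, with the string going to result.addValue() instead of System.out:

public class CombineTest
{
    public static void main(String[] args)
    {
        // 10 sample values = 2 records of 5 fields each (hypothetical data)
        String[] a = { "m1", "m2", "m3", "m4", "m5",
                       "n1", "n2", "n3", "n4", "n5" };
        String fielddelimiter = ",";
        String recorddelimiter = "|";

        String output = "";
        int grab = 0;

        while (grab < a.length)   // assumes a.length is a multiple of 5
        {
            for (int j = 0; j < 5; j++)
            {
                output += a[grab];

                if (j < 4)
                {
                    output += fielddelimiter;
                }

                grab++;
            }

            output += recorddelimiter;
        }

        System.out.println(output);   // prints m1,m2,m3,m4,m5|n1,n2,n3,n4,n5|
    }
}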

Former Member

The issue is in passing a large number of values to a user-defined function.

Before passing them, I am mapping according to a logic, so the mapping looks very congested.

Thanks

Former Member

If you have a document structure like this:

Root

-- next level (5 structures of this)

---- next level (20 elements of this)

I have an easy solution.

Former Member

Hi,

The source is a material master IDoc.

The target is a stored procedure.

I have done all the mappings.

But now the stored procedure needs all the values appended together and sent as one string, with a record delimiter of | and a field delimiter of ,.

I don't want a single append function, since there is a complex mapping for each field, so I cannot pass all the mapped values to one append UDF in one go.

For this reason I need 2-3 similar UDFs: the first UDF could append 5 values, and its result could be passed to a second UDF, which appends the result from the first UDF to the second set of values.

Thanks
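For what it's worth, a minimal sketch of the cascading idea, assuming two simple-value UDFs with made-up names appendFive and appendNext (in the mapping editor you would only type the bodies; the method wrappers are shown here for readability):

// UDF1: joins five mapped values with the field delimiter ","
public String appendFive(String v1, String v2, String v3, String v4, String v5)
{
    return v1 + "," + v2 + "," + v3 + "," + v4 + "," + v5;
}

// UDF2: glues a previous result onto the next joined set with the
// record delimiter "|"; chain it as often as needed
public String appendNext(String previous, String next)
{
    return previous + "|" + next;
}

With four appendFive calls feeding three chained appendNext calls, twenty mapped values end up as one string of four records, e.g. f1,f2,f3,f4,f5|f6,...,f10|f11,...,f15|f16,...,f20.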

Former Member

The mapping was working fine when I was making calls to the stored procedure before,

but a call had to be made to the stored procedure for each (repeating) cross-reference segment.

This led to the overhead of connecting to the stored procedure many times.

But now I have to append everything together into one string and make only one call to the stored procedure.

Ask me more questions if you have more doubts.

Former Member

I'd like to see the structure of these documents. I can probably help you if I understand this better.