JDBC Sender - Large amount of data

former_member184619
Active Contributor

Hi Folks,

I have a requirement where we are getting a large amount of invoice data. I am looking for a solution to chunk the data, based on the requirements below:

1. There are more than 100,000 rows staged on a daily basis.

2. There is no indicator that suggests an invoice is complete (ECC doesn't accept partial invoices).

3. There is no limit on the number of line items in a particular invoice.

4. The JDBC adapter's max size/max rows settings would not work, as the adapter will not fetch more data than the defined limit, per SAP Note 1253826 (service.sap.com/sap/support/notes/1253826).

5. I can't simply read 1,000 rows or so at a time, as I can't be sure the batch will contain only complete invoices.

I am on PO 7.4 single stack, fetching data from DB2 and sending it as an IDoc.

Please suggest what can be done on the database side or in PI to chunk the data.

Regards,

Sachin Dhingra

Accepted Solutions (1)

former_member184720
Active Contributor

Do those 100k rows belong to one invoice? If there is no indicator, how are you grouping them currently?


I would suggest changing the scenario to a proxy and handling the invoice creation in ECC.

PI would just pass all the rows without any validation.

former_member184619
Active Contributor

Thank you, Hareesh.

As of now, I am fetching per division, but that load is also ~100K. At some point in production it might fail.

I was planning to go for BPM; would that be a good option?

Yes, a proxy sounds like a good idea, but there are a couple of challenges identified by the ABAP team, as below:

1. They would have to store the data in a Z table and then read it from there.

2. Exception handling and retriggering would be difficult.

Regards,

Sachin Dhingra

former_member184720
Active Contributor

1. They would have to store the data in a Z table and then read it from there.

What is the challenge in having a Z table? Instead of constantly tying up PI resources (a BPM would stay in a running state), this should work really well.

2. Exception handling and retriggering would be difficult.

What kind of exceptions are you expecting? You can still do data validations in PI before loading into the Z table, right?


Retriggering can always be done from the source system if there is any issue with the data.
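For example, re-flagging a failed invoice for another run can be a one-statement job on the source side. A minimal sketch, assuming a hypothetical staging table INVOICE_STG with columns INV_NO and PROC_FLAG (all names and connection details are placeholders):

```java
import java.sql.*;

// Sketch: the source system resets the processed flag so the sender JDBC
// channel picks the invoice up again on its next poll.
public class RetriggerInvoice {
    public static void main(String[] args) throws SQLException {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:db2://dbhost:50000/INVDB", "user", "pass");
             PreparedStatement ps = con.prepareStatement(
                 "UPDATE INVOICE_STG SET PROC_FLAG = 'N' WHERE INV_NO = ?")) {
            ps.setString(1, "4711"); // invoice number to retrigger
            ps.executeUpdate();
        }
    }
}
```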


Answers (1)

iaki_vila
Active Contributor

Hi everyone,

I think Hareesh's suggestion is the best. With BPM you won't get any significant improvement, you will still have the second problem the ABAPers raised, and you will add one more point of communication complexity only to collect the data. I think Proxy-to-JDBC would be the best option: the ABAP side can check with a simple Z table whether the records have arrived correctly, and they can even build an ALV report for the client if the client wants deeper control.

If you have a field to distinguish one invoice's set of records from another's, it will not be difficult for an ABAPer to develop the calls and to do an automatic retry if any records are missing.

From my point of view, as a general rule (not always), the integration should be the least complex part, allowing better scalability and easier changes.

Regards.

former_member184619
Active Contributor

Hi Inaki,

Thank you for the detailed analysis.

I was thinking of a BPM design where I read, say, 1,000-2,000 lines, use a correlation to identify the next invoice, and keep sending to ECC. However, I have not designed the BPM yet.

Going with ABAP, they would have to do the exception handling along with the interface design. My problem is how to make sure I am reading a complete invoice.

Another solution was to read a single invoice at a time. But the volume is so high that it would take 5-6 hours to get all the invoices into SAP, even with polling at millisecond intervals. The sketch below shows what one poll would do.
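For reference, this is roughly what one poll does in that design, written out as plain JDBC (in PI the SELECT/UPDATE pair would sit in the sender JDBC channel; INVOICE_STG, INV_NO and PROC_FLAG are the same hypothetical names as above):

```java
import java.sql.*;

// Sketch of one poll cycle: read every row of the oldest unprocessed
// invoice, then flag exactly those rows. Running SELECT and UPDATE in one
// transaction mirrors the sender channel's own behaviour.
public class SingleInvoicePoll {
    public static void main(String[] args) throws SQLException {
        String next = "(SELECT MIN(INV_NO) FROM INVOICE_STG WHERE PROC_FLAG = 'N')";
        try (Connection con = DriverManager.getConnection(
                 "jdbc:db2://dbhost:50000/INVDB", "user", "pass")) {
            con.setAutoCommit(false);
            try (Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT * FROM INVOICE_STG WHERE PROC_FLAG = 'N' AND INV_NO = " + next)) {
                while (rs.next()) {
                    System.out.println(rs.getString("INV_NO")); // rows handed to the mapping
                }
            }
            try (Statement st = con.createStatement()) {
                st.executeUpdate(
                    "UPDATE INVOICE_STG SET PROC_FLAG = 'Y' WHERE PROC_FLAG = 'N' AND INV_NO = " + next);
            }
            con.commit();
        }
    }
}
```

One invoice per poll keeps every message complete, which is exactly why it is so slow at this volume.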

Regards,

Sachin Dhingra

former_member184619
Active Contributor

Hi Folks,

Would it be a good idea to go with a JDBC lookup, as below:

1. Get a set of (say 20) invoice numbers via the sender JDBC channel and update the flag.

2. Do a JDBC lookup on these 20 invoices to get the details (I believe we would need to write a UDF for a custom JDBC lookup; see the sketch below). This would open another connection to the DB.
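A rough sketch of such a UDF, using the standard com.sap.aii.mapping.lookup API. The business component BC_DB2, the channel JDBC_RCV_LOOKUP, the SQL_QUERY payload layout and the table/column names are all assumptions to adjust:

```java
// Paste into the graphical-mapping UDF editor; declare these in the UDF's
// import list: com.sap.aii.mapping.lookup.*, java.io.*
public String fetchInvoiceDetails(String invNo, Container container)
        throws StreamTransformationException {
    SystemAccessor accessor = null;
    try {
        // Receiver JDBC channel used purely for the lookup (assumed names).
        Channel channel = LookupService.getChannel("BC_DB2", "JDBC_RCV_LOOKUP");
        accessor = LookupService.getSystemAccessor(channel);
        // JDBC receiver SQL_QUERY document; $NO$ is bound from the <key> element.
        String request =
              "<root><stmt><inv action=\"SQL_QUERY\">"
            + "<access>SELECT * FROM INVOICE_STG WHERE INV_NO = '$NO$'</access>"
            + "<key><NO>" + invNo + "</NO></key>"
            + "</inv></stmt></root>";
        XmlPayload payload =
            LookupService.getXmlPayload(new ByteArrayInputStream(request.getBytes("UTF-8")));
        Payload result = accessor.call(payload); // one DB round trip per call
        // Return the raw result XML; real code would parse out the line items.
        return new java.util.Scanner(result.getContent(), "UTF-8").useDelimiter("\\A").next();
    } catch (Exception e) {
        throw new StreamTransformationException("JDBC lookup failed: " + e.getMessage(), e);
    } finally {
        if (accessor != null) accessor.close();
    }
}
```

Note that every call is an extra round trip, so 20 invoices means 20 additional queries per mapping run.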

But would there be any performance issues or other downsides?

Regards,

Sachin Dhingra

former_member184720
Active Contributor

I don't think it's a good idea to read all the detail rows (result sets) for each of the 20 invoices using a lookup during the mapping.

Instead, I would prefer to handle that calculation/restriction with table joins in a stored procedure, if it cannot be achieved with a single SELECT statement.
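For instance, a single SELECT can already return only complete invoices by joining the detail rows to a capped list of invoice numbers. A sketch against the same hypothetical INVOICE_STG table (DB2 syntax; older DB2 releases may need ROW_NUMBER() instead of FETCH FIRST inside the derived table):

```java
import java.sql.*;

// Sketch: fetch every line item of the next 20 unprocessed invoices in one
// statement. In PI this SQL would live in the sender channel or a stored
// procedure rather than in Java code.
public class NextTwentyInvoices {
    public static void main(String[] args) throws SQLException {
        String sql =
              "SELECT d.* FROM INVOICE_STG d "
            + "JOIN (SELECT DISTINCT INV_NO FROM INVOICE_STG WHERE PROC_FLAG = 'N' "
            + "      ORDER BY INV_NO FETCH FIRST 20 ROWS ONLY) h "
            + "  ON d.INV_NO = h.INV_NO";
        try (Connection con = DriverManager.getConnection(
                 "jdbc:db2://dbhost:50000/INVDB", "user", "pass");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString("INV_NO")); // complete invoices only
            }
        }
    }
}
```

The cap applies to invoice numbers, not rows, so no invoice is ever cut in half.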