PI 7.1 JDBC sender adapter huge load from DB select

AndyK
Explorer
0 Kudos

Hi,

We have a problem: the JDBC sender adapter selects a huge amount of data from an Oracle DB. The adapter itself seems able to cope with the load, but sending the data to the Integration Server causes problems. We found values we could adjust, but I don't think simply increasing them is a good solution.

My question is:

Is there any way to send the data in blocks using the standard PI JDBC adapter?

Thank you and regards,

Andreas

Accepted Solutions (1)

naveen_chichili
Active Contributor
0 Kudos

Hi Andreas,

You can split the load into smaller chunks through the SELECT query.

Example: you have to pick up 1000 records from the table with the sender JDBC adapter. Instead of reading them all at once, read them in ten polls of 100 records each, as follows (see the SQL sketch below):

--> In the SELECT statement: select only the rows that have not been read yet, limited with ROWNUM, e.g. select * from table where rownum <= 100 and read_record = 'N'

--> In the UPDATE statement: set a flag field on the rows you just read, e.g. from 'N' to 'Y'.

Before you configure this scenario, create a column such as read_record of type VARCHAR2 or CHAR(1) with default value 'N'. In the UPDATE statement, set that column to 'Y' for the 100 records read in the SELECT step.

This way you reread the same table sequentially, 100 records per poll, and the message size stays the same every time.
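
A minimal sketch of the two channel statements, assuming a hypothetical source table ORDERS and the read_record flag column described above (adjust names and chunk size to your table):

-- SELECT statement of the sender JDBC channel: pick at most 100 unread rows per poll
SELECT * FROM orders WHERE read_record = 'N' AND ROWNUM <= 100

-- UPDATE statement of the same channel: mark the rows just read as processed
UPDATE orders SET read_record = 'Y' WHERE read_record = 'N' AND ROWNUM <= 100

Both statements run in the same poll cycle; with the channel's transaction isolation level set to serializable (as the SAP documentation recommends for this pattern), the UPDATE flags exactly the rows the SELECT returned, so no record is picked up twice or skipped.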

Regards,

Naveen

Answers (3)

Former Member
0 Kudos

Hi,

Check these blogs:

/people/dheeraj.kumar5/blog/2010/04/12/pixi-sender-jdbc-select-query-and-update-query-to-limit-our-records-to-be-picked-up-by-xi-from-database-to-avoid-huge-message-processing-failed

/people/peng.shen2/blog/2009/12/23/pi-how-to-handle-high-volume-data-per-jdbc-adapter

Thanks

Amit

baskar_gopalakrishnan2
Active Contributor
0 Kudos

Control your result set from the JDBC sender with the SQL query.

Things to do:

1) Write the query so it returns only a limited number of records (say 50 or 100): create a flag column in the table and select only the records whose flag is not yet set (a one-time DDL sketch for this follows below).

2) Update the flag after reading, so that the next SELECT does not reread the same data.

That is the better design. Even if a large volume of data exists on the sender side, you don't need to worry, because you control the message size through the query. Once it is done, you won't need to change the design for a long time.

Naveen already presented an example above. We use this design and have had absolutely no issues.
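
As a one-time preparation, the flag column from point 1) could be added like this; the table and column names are again just placeholders:

-- Add the flag column; new and existing rows start as 'N' (not yet read)
ALTER TABLE orders ADD (read_record CHAR(1) DEFAULT 'N' NOT NULL)

-- Optional: index the flag so the polling SELECT stays fast on large tables
CREATE INDEX idx_orders_read_record ON orders (read_record)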

former_member854360
Active Contributor
0 Kudos

Hi,

You can use the following parameters in the JDBC sender channel.

Specify additional parameter names and parameter values in the Advanced-mode table. Large messages can cause issues, so these parameters can be set in the table to limit the message size per poll:

● msgLimit

● maxMsgSize (see SAP Note 1253826)

● maxRowSize (see SAP Note 1253826)

Additional parameters are published in SAP Note 801367.
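
For illustration only, such an entry in the Advanced-mode table could look like the lines below; the values are placeholders, and the exact meaning and units of each parameter are described in the SAP notes mentioned above:

Parameter Name    Parameter Value
msgLimit          10
maxMsgSize        1024
maxRowSize        100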

Refer to the link:

Defining Additional Parameters in Advanced mode

http://help.sap.com/saphelp_nw73/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm