JDBC Adapter (Sender) with Huge Amount of Records...

Former Member
0 Kudos

Hi Experts,

I'm looking for a solution to move data from one database to another. As I understand it, we can use the JDBC adapter as both sender and receiver.

But my problem is that the source database contains a CO table with more than 20,000,000 records, and it needs to be updated every time the data changes. My question is: can the JDBC adapter handle this huge amount of data?

If not, can we split it? Or is there any alternative to the JDBC adapter?

Thanks in advance,

terry_n

Accepted Solutions (0)

Answers (3)

juan_vasquez2
Active Participant
0 Kudos

You can use a stored procedure.

That way you can control the maximum number of rows to return, either as a constant value or via a parameter of the procedure.

Procedure getMyData(par1, par2, ..., maxRows)...

regards

Juan V
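The stored-procedure idea above can be sketched in a few lines. This is a minimal, runnable illustration only: it uses Python's sqlite3 in place of the real database, and the table name CO_DATA and function get_my_data are invented stand-ins for the procedure's maxRows parameter.

```python
import sqlite3

# Hypothetical table standing in for the large CO table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CO_DATA (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO CO_DATA (value) VALUES (?)",
                 [(f"row{i}",) for i in range(50)])

def get_my_data(conn, max_rows):
    """Emulates Procedure getMyData(..., maxRows): cap the rows returned per call."""
    cur = conn.execute("SELECT id, value FROM CO_DATA ORDER BY id LIMIT ?",
                       (max_rows,))
    return cur.fetchall()

batch = get_my_data(conn, 10)
print(len(batch))  # at most 10 rows per call, regardless of table size
```

The point of the parameter is that the caller (here, the sender channel's polling query) controls the batch size, so no single poll has to materialize 20 million rows.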

former_member185881
Active Participant
0 Kudos

Hi

It's very simple if you know SQL (or Oracle).

Whatever your database is, just apply a TOP clause in the SELECT query and the UPDATE query of your sender JDBC adapter.

If you don't know the exact syntax, a quick Google search will show it.

What the TOP query does:

For example (SQL Server), suppose I have to pick only 10,000 records out of the 50,000 present in the table.

Here I would apply:

SELECT TOP 10000 fieldName1, fieldName2 FROM TableName WHERE clause...

UPDATE TOP (10000) TableName SET fieldName1 = '1' WHERE clause...

Use these queries and your problem will be solved.

Please don't forget to set Transaction Level Isolation = serializable (under Advanced --> Additional Parameters in your sender communication channel).

I hope this solves your problem.
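The pick-and-mark batching described above can be sketched end to end. This is an illustrative sketch, not PI configuration: sqlite3 stands in for SQL Server (so TOP becomes LIMIT), and the table and column names (SRC, fieldName1, picked) are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SRC ("
             "id INTEGER PRIMARY KEY, fieldName1 TEXT, picked INTEGER DEFAULT 0)")
conn.executemany("INSERT INTO SRC (fieldName1) VALUES (?)",
                 [(f"v{i}",) for i in range(25)])

BATCH = 10
total = 0
while True:
    # Equivalent of SELECT TOP 10000 ... : read at most one batch of unprocessed rows.
    rows = conn.execute(
        "SELECT id, fieldName1 FROM SRC WHERE picked = 0 LIMIT ?", (BATCH,)
    ).fetchall()
    if not rows:
        break
    # Equivalent of UPDATE TOP (10000) ... : mark exactly the rows just read,
    # so the next poll skips them.
    conn.executemany("UPDATE SRC SET picked = 1 WHERE id = ?",
                     [(r[0],) for r in rows])
    conn.commit()
    total += len(rows)

print(total)  # all 25 rows processed, in batches of at most 10
```

The serializable isolation level the post mentions matters for the same reason the loop marks rows immediately after reading them: the SELECT and its paired UPDATE must see the same set of rows, or a batch can be marked without ever being picked up.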

Regards

Dheeraj Kumar

Former Member
0 Kudos

Thanks for this great idea; this might be the solution!

Thanks...

former_member185881
Active Participant
0 Kudos

Hi

You can also refer to my blog:

/people/dheeraj.kumar5/blog/2010/04/12/pixi-sender-jdbc-select-query-and-update-query-to-limit-our-records-to-be-picked-up-by-xi-from-database-to-avoid-huge-message-processing-failed

Regards

Dheeraj Kumar

Former Member
0 Kudos

Hi All,

I have the same issue: we need to fetch 6 million records from a SQL database and insert them into ECC via proxy. The problem is that the SQL DBA does not want to provide a flag for marking processed records (through which we could split the data).

So fetching that many records in one go is one issue. The other is how many messages will be generated in SAP PI (going to ECC): will there be 6 million? I'm guessing we can restrict this using a multi-mapping, but the overall design looks shaky. Can you advise on the best way forward?

Thanks!

rajasekhar_reddy14
Active Contributor
0 Kudos

Hi,

This is a common problem we face with the JDBC adapter; I think splitting the data into chunks and processing them is the best solution.

Search SDN; you will find similar discussions on the same concept.

Cheers,

Raj

Former Member
0 Kudos

Hi Raj,

Thanks for your answer, I will try to split it somehow. Thanks...

I found this thread, it might help...

Cheers,

Terry_n

abhay_rajhans2
Contributor
0 Kudos

Hi Terry_n,

You can also try one thing: when you read from the sender database, update a flag field in each record to 'X'. Ask the database team to clear the 'X' flag whenever a record is updated, and to leave the flag blank when inserting new rows into that table. That way, whenever there is a new entry or an update in the table, XI will only pick the records whose flag is blank. This will reduce the performance issue.

In the sender JDBC adapter you specify this condition with a SELECT query like SELECT * FROM table WHERE abc IS NULL (abc being the flag field), and in the update statement write UPDATE table SET abc = 'X' WHERE abc IS NULL.
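The blank-flag lifecycle described above can be shown in a short runnable sketch. Again this is illustrative only: sqlite3 stands in for the real database, and the names CO_TABLE and abc follow the post but are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CO_TABLE ("
             "id INTEGER PRIMARY KEY, payload TEXT, abc TEXT)")
# New or changed rows arrive with the flag left NULL (no 'X').
conn.executemany("INSERT INTO CO_TABLE (payload) VALUES (?)",
                 [("a",), ("b",), ("c",)])

# Sender SELECT: pick only unflagged rows.
picked = conn.execute(
    "SELECT id, payload FROM CO_TABLE WHERE abc IS NULL").fetchall()
# Sender UPDATE: stamp what was picked so it is not read again.
conn.execute("UPDATE CO_TABLE SET abc = 'X' WHERE abc IS NULL")
conn.commit()

# A later change clears the flag, so only that row is picked up next poll.
conn.execute("UPDATE CO_TABLE SET payload = 'b2', abc = NULL WHERE id = 2")
repicked = conn.execute(
    "SELECT id FROM CO_TABLE WHERE abc IS NULL").fetchall()
print(len(picked), len(repicked))  # 3 1
```

Terry's later concern about the very first run is visible here too: on the first poll every row is unflagged, so the initial SELECT touches the whole table; after that, each poll only sees the delta.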

JaganMohan
Participant
0 Kudos

Dear Terry_n,

Have a look at this blog; it may give you some details related to this issue.

/people/peng.shen2/blog/2009/12/23/pi-how-to-handle-high-volume-data-per-jdbc-adapter

Regards,

JP.

Former Member
0 Kudos

Hi Rajhans,

That is a very good idea you suggested. One thing I am concerned about: the very first selection on this table will consume a lot of time. But I will try. Big thanks to you and everybody...

Cheers,

Terry_n

former_member568822
Active Participant
0 Kudos

Hi,

Try not to do the data conversion using JDBC, as it consumes a lot of Java memory and may cause the Java stack to go down.