How to Handle Huge record count in JDBC

Former Member
0 Kudos

Hi,

Can we process 20 lakh (2 million) records, sized at nearly 136 MB, using a JDBC sender channel in a single call? The same data then needs to be delivered to a target system, which is also JDBC.

Kindly mention how this can be achieved.

Regards

Accepted Solutions (0)

Answers (2)


former_member181985
Active Contributor
0 Kudos

Hi Venkata,

If it is a one-time activity, then you can give it a try. It all depends on your PI system sizing.

You can also retrieve database records in chunks to keep each message at a manageable size.
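The chunking idea can be sketched as follows. In PI this is configured declaratively on the JDBC sender channel (a SELECT with a row limit plus an UPDATE that flags rows already read); the Python/SQLite code below is only a hypothetical illustration of that pattern, and the table and column names (`orders`, `processed`) are assumptions, not from the original thread.

```python
import sqlite3

CHUNK_SIZE = 50000  # rows per poll; tune so each message stays small


def read_in_chunks(conn, chunk_size=CHUNK_SIZE):
    """Yield lists of unprocessed rows, marking each chunk as read.

    Mirrors a JDBC sender channel's SELECT/UPDATE pair: fetch a bounded
    set of rows, then flag them so the next poll skips them.
    """
    while True:
        cur = conn.execute(
            "SELECT id, payload FROM orders WHERE processed = 0 LIMIT ?",
            (chunk_size,),
        )
        rows = cur.fetchall()
        if not rows:
            break  # nothing left to process
        # Flag the rows just read so they are not picked up again.
        conn.executemany(
            "UPDATE orders SET processed = 1 WHERE id = ?",
            [(row[0],) for row in rows],
        )
        conn.commit()
        yield rows
```

Each yielded chunk corresponds to one PI message, so memory use is bounded by the chunk size rather than the full 136 MB result set.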

Regards,

Praveen Gujjeti

Former Member
0 Kudos

Hi Praveen

PI version is 7.0

Currently the data is being retrieved in chunks.

But the business wants the entire set of data in a single call, because the chunked approach is hampering factory productivity.

iaki_vila
Active Contributor
0 Kudos

Hi Venkata,

Personally, I don't like working with huge data volumes through the JDBC adapter. If your client wants to process all the data in one go and it is a priority not to use the chunking option, I wonder if you could change the architecture and pass a file to the endpoint instead. If you don't need a mapping, PI can transfer huge files without any ESR development, and the performance should be acceptable. The endpoint developers would then need a stored procedure to read the file and process it.
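On the endpoint side, that stored procedure would stream the file into the target table in batches rather than loading 136 MB at once. A minimal sketch of that loading pattern, using Python with SQLite and an illustrative `target(id, payload)` table (names assumed, not from the thread):

```python
import csv
import sqlite3

BATCH_SIZE = 10000  # rows inserted per transaction


def load_file(conn, path, batch_size=BATCH_SIZE):
    """Stream a CSV file into the target table without holding it all in memory."""
    inserted = 0
    batch = []
    with open(path, newline="") as fh:
        for row in csv.reader(fh):
            batch.append(row)
            if len(batch) >= batch_size:
                # Commit each batch so a failure loses at most one batch.
                conn.executemany("INSERT INTO target (id, payload) VALUES (?, ?)", batch)
                conn.commit()
                inserted += len(batch)
                batch = []
    if batch:  # flush the final partial batch
        conn.executemany("INSERT INTO target (id, payload) VALUES (?, ?)", batch)
        conn.commit()
        inserted += len(batch)
    return inserted
```

In a real landscape this logic would live in a stored procedure or a database loader utility on the target system; the point is that the file is consumed row by row, so memory stays flat regardless of file size.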

Regards.

anand_shankar10
Active Participant
0 Kudos

With your specific requirement, you can do this and configure message prioritization for it, but only if it runs no more than once a day, as it will consume a huge amount of memory and impact performance. And I am sure it would take a good amount of time to process.

Thanks

Anand