
File (huge volume of data) to File with a lookup to ECC

Former Member

Hi,

I have a requirement where a batch file containing 3000 records is to be picked up.

For each record there are two table lookups, i.e. 6000 lookups to ECC for a single file. This will affect the performance of PI.

I am thinking of a synchronous proxy scenario where we pick up the file and send an identifier for each record to ECC via proxy.

Based on the identifiers, ECC can send the data required to transform each record back to PI in a single proxy response.

Then, using the initial file we received, we transform it with the values returned from ECC and send a file to the target.

I don't know yet how feasible this design would be.

Please suggest any more feasible and simpler design solutions.

Thanks,

Vinayaka Akkasali.

Accepted Solutions (0)

Answers (2)

ambrish_mishra
Active Contributor

Hi Vinayaka,

You can pick up the batch file, do an RFC lookup by sending the data as a table to ECC (a single call to ECC) and get the data back. I think the queues will be managed automatically. Subsequently you can create the file.
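Below is a minimal sketch of such a single-call RFC lookup from the mapping, assuming a custom RFC such as Z_GET_LOOKUP_DATA that takes a table of keys and returns the matching rows, plus an RFC receiver channel CC_RFC_LOOKUP on business system BS_ECC set up for lookups; all of these names are placeholders. The helper could be called from a UDF or a Java mapping and makes exactly one round trip to ECC for the whole file.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.Scanner;

import com.sap.aii.mapping.lookup.Channel;
import com.sap.aii.mapping.lookup.LookupService;
import com.sap.aii.mapping.lookup.Payload;
import com.sap.aii.mapping.lookup.RfcAccessor;
import com.sap.aii.mapping.lookup.XmlPayload;

public class EccTableLookup {

    // Builds one RFC request containing all keys and executes a single call to ECC.
    public static String lookupAllKeys(String[] keys) throws Exception {
        // Assemble the RFC request XML: one IT_KEYS row per source value.
        StringBuilder request = new StringBuilder();
        request.append("<ns0:Z_GET_LOOKUP_DATA xmlns:ns0=\"urn:sap-com:document:sap:rfc:functions\">");
        request.append("<IT_KEYS>");
        for (String key : keys) {
            request.append("<item><KEY_FIELD>").append(key).append("</KEY_FIELD></item>");
        }
        request.append("</IT_KEYS>");
        request.append("</ns0:Z_GET_LOOKUP_DATA>");

        RfcAccessor accessor = null;
        try {
            // Resolve the RFC receiver channel configured in the Integration Directory.
            Channel channel = LookupService.getChannel("BS_ECC", "CC_RFC_LOOKUP");
            accessor = LookupService.getRfcAccessor(channel);

            InputStream in = new ByteArrayInputStream(request.toString().getBytes("UTF-8"));
            XmlPayload payload = LookupService.getXmlPayload(in);

            // One round trip to ECC for the whole file.
            Payload response = accessor.call(payload);

            // Return the raw response XML; the caller parses the result rows into key/value pairs.
            Scanner scanner = new Scanner(response.getContent(), "UTF-8").useDelimiter("\\A");
            return scanner.hasNext() ? scanner.next() : "";
        } finally {
            if (accessor != null) {
                accessor.close();
            }
        }
    }
}
```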

However, to improve performance you can try sending only the unique data values to ECC and then populate the data based on keys.

For example, if out of 3000 records the 3000 source values come down to 500 unique values and the rest occur more than once, send the 500 unique values through the RFC call while creating the header node, get the results back and then map the key-value pairs during field population.
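As a rough illustration of that deduplication idea (the class and field names here are made up), the unique values can be collected before the call and the returned rows kept as key/value pairs, so every one of the 3000 records is resolved in memory rather than with its own lookup:

```java
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

public class UniqueValueMapper {

    // Reduce e.g. 3000 source values to their distinct set before calling ECC.
    public static String[] uniqueValues(String[] sourceValues) {
        Set<String> unique = new LinkedHashSet<String>();
        for (String value : sourceValues) {
            unique.add(value);
        }
        return unique.toArray(new String[unique.size()]);
    }

    // After the single RFC call, resolve every record from the key/value map
    // instead of going back to ECC per record.
    public static String[] resolve(String[] sourceValues, Map<String, String> lookupResult) {
        String[] target = new String[sourceValues.length];
        for (int i = 0; i < sourceValues.length; i++) {
            String mapped = lookupResult.get(sourceValues[i]);
            target[i] = (mapped != null) ? mapped : ""; // fallback for keys ECC did not return
        }
        return target;
    }
}
```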

It depends on your requirement and the data you expect from ECC.

NB: 3000 records is not much, but the volume of data (number of fields) matters.

Hope it helps!

Ambrish

allamudi_loordh
Active Participant

Hi,

What Ambrish said is correct. Not all 3000 values will be unique. Try to import them all into your PI mapping in the first call itself, use a GlobalContainer, and then do the lookup within the mapping itself.

You may need to write a UDF for this.
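Here is a minimal sketch of that GlobalContainer pattern in PI 7.1 style (tf7 classes); the UDF names, the LOOKUP_CACHE key and the doSingleRfcLookup(...) helper are illustrative assumptions, not standard objects. The first UDF runs once (execution type "All Values of a Context", e.g. while mapping the root node) and caches the lookup result; the second runs per target field and only reads the cache:

```java
import java.util.HashMap;
import java.util.Map;

import com.sap.aii.mappingtool.tf7.rt.Container;
import com.sap.aii.mappingtool.tf7.rt.GlobalContainer;
import com.sap.aii.mappingtool.tf7.rt.ResultList;

public class GlobalCacheUdfs {

    // UDF 1 (queue UDF): perform the single lookup for all keys and cache the result.
    public void loadLookupCache(String[] keys, ResultList result, Container container) {
        Map<String, String> cache = doSingleRfcLookup(keys);   // one call to ECC
        GlobalContainer globalContainer = container.getGlobalContainer();
        globalContainer.setParameter("LOOKUP_CACHE", cache);   // visible to all later UDF calls
        result.addValue("");                                   // dummy output for the target node
    }

    // UDF 2 (single-value UDF): called per field, reads from the cache only.
    public String getFromCache(String key, Container container) {
        GlobalContainer globalContainer = container.getGlobalContainer();
        Object cached = globalContainer.getParameter("LOOKUP_CACHE");
        if (!(cached instanceof Map)) {
            return "";                                         // cache not loaded (yet)
        }
        @SuppressWarnings("unchecked")
        Map<String, String> cache = (Map<String, String>) cached;
        String value = cache.get(key);
        return (value != null) ? value : "";
    }

    // Placeholder for the single RFC table lookup (see the earlier sketch).
    private Map<String, String> doSingleRfcLookup(String[] keys) {
        return new HashMap<String, String>();
    }
}
```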

Regards,

Loordh.

baskar_gopalakrishnan2
Active Contributor

I would recommend splitting the scenario into two. The first scenario is file to proxy (async). Instead of doing two lookups for each record and passing identifiers back and forth, I would apply the identifier logic and update the relevant data at the proxy level. Once all the data has been written for each record, create the file and place it.

The second is a simple pass-through file-to-file scenario (without mapping) that sends the file to the target system. If you don't want the second file sender adapter to keep polling all the time, you can use a proxy or some batch-job mechanism to externally control the file sender adapter (second scenario), starting and stopping its polling.
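A minimal sketch of what such external control could look like, assuming the second file sender channel has "External Control" enabled in the Integration Directory and PI's standard ChannelAdminServlet is used; host, port, user, business system and channel names below are placeholders:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChannelControl {

    // action is "start", "stop" or "status"
    public static String controlChannel(String action) throws Exception {
        String url = "http://pihost:50000/AdapterFramework/ChannelAdminServlet"
                + "?party=*&service=BS_TARGET&channel=CC_FILE_SENDER_2"
                + "&action=" + action;

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("GET");
        // The servlet requires a PI user with the appropriate administration role.
        String credentials = java.util.Base64.getEncoder()
                .encodeToString("piappluser:password".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + credentials);

        // Read back the servlet's response (channel status / confirmation).
        StringBuilder response = new StringBuilder();
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        String line;
        while ((line = reader.readLine()) != null) {
            response.append(line).append('\n');
        }
        reader.close();
        conn.disconnect();
        return response.toString();
    }

    public static void main(String[] args) throws Exception {
        // Example: start the second file sender once the first scenario has written all files.
        System.out.println(controlChannel("start"));
    }
}
```

A batch job, or the proxy at the end of the first scenario, could call controlChannel("start") and later controlChannel("stop"), so the second channel only polls while there is actually something to send.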

Former Member

Hi Baskar,

" If you dont want to run the second file adapter to rio wait for all the time. You can use proxy  or some batch job mechanism to externally control the file sender adapter(second scenario) to start polling and stopping."

Can you please elaborate on the above lines?

Thanks,

Vinayaka Akkasali.