
Need to post Full Load data (55,000 records) to the target system.

Former Member
0 Kudos

Hi All,

We are getting the data from the SAP HR system and need to post it to a partner system, so we configured a Proxy (SAP) to File (partner) scenario. We need to append the data of each message to the target file. Since this is a very critical interface, we have used dedicated queues. The scenario is working fine in D. When the interface was transported to Q, it was tested with a full load, i.e. with 55,000 messages. All messages were processed successfully in the Integration Engine, but processing in the Adapter Engine took nearly 37 hours. We need to post all 55,000 records within 2 hours.

The design of this interface is simple. We have used direct mapping, and the size of each message is 1 KB, but we need to append all messages to one file on the target side. We are using the Advantco SFTP adapter as the receiver and a proxy as the sender.

Could you please suggest a solution to process all 55,000 messages within 2 hours?

Thanks,

Soumya.

Accepted Solutions (1)


former_member208856
Active Contributor
0 Kudos

How are you sending the data from the proxy side?

It should be filled into a single packet (55,000 records); if it is sent one record at a time, it will give you problems.

Receive a single packet from the proxy and pass the same to the file.

Former Member
0 Kudos

Hi Sandeep,

Thanks for the reply. From SAP, they will trigger 55,000 messages, and each message has only 1 record (&lt;1 KB). One approach is to collect a number of messages from the proxy in PI and send them to the target side, but for this we would need to implement BPM. Is there any other approach to achieve this without using BPM?

Thanks,

Soumya.

former_member208856
Active Contributor
0 Kudos

BPM is not a good approach for collecting huge proxy data in PI.

You should contact the ABAP consultant who is writing the code for the sender-side proxy.

Ask him to collect the messages in the proxy code at the ABAP level and send that to PI.

Answers (3)


RaghuVamseedhar
Active Contributor
0 Kudos

Hi Soumya,

I understand your scenario as: HR data has to be sent to a third-party system once a day. I guess they are synchronizing employee data (55,000 records) in the third-party system with SAP HR data, daily.

I would design this scenario as follows:-

I would ask the ABAPer to write an ABAP program which runs at 12:00, picks up the 55,000 records from the SAP HR tables, and places them in one file. That file will be placed in the SAP HR file system (you can see it using AL11). At 12:30, a PI file channel will pick up the file and transfer it to the third-party target system as is, without any transformation: a File-to-File pass-through scenario (no ESR objects). Then ask the target system team to take the file and run their program (they should have some SQL routines) to insert these records into the target system tables.

If 55,000 records make a huge file on the SAP HR system, ask the ABAPer to split it into parts. PI will pick them up in sequence based on the file name.

In this approach, I would ask both the SAP HR (sender) and third-party (target) system people to be flexible. Otherwise, I would say it is not technically possible with the current PI resources. In my opinion, PI is middleware, not a system in which huge computations can be done. If messages are coming from different systems, then collection in the middleware makes sense; in your case, collecting a large number of messages from a single system at high frequency is not advisable.

If the third-party target system people are not flexible, then go for a File-to-JDBC scenario. Ask the SAP HR ABAPer to split the input file into a larger number of files (10-15; your PI system should be able to handle that). At the receiver JDBC channel, use native SQL. You need a Java mapping to construct the SQL statements in PI. Don't convert the flat file to the JDBC XML structure; in your case PI cannot handle such a huge XML payload.
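The statement-building core of such a Java mapping could be sketched roughly as below. The table and column names (EMP_STAGE, EMP_ID, EMP_NAME) and the semicolon-delimited record layout are assumptions for illustration only; a real PI Java mapping would extend AbstractTransformation and read the payload from the input stream, which is omitted here:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch only: turn flat-file records into a few multi-row INSERT
// statements instead of one XML row structure per record.
// Table/column names are placeholders, not from the actual interface.
public class NativeSqlBuilder {

    // Each input line is assumed to look like "id;name".
    // 'rowsPerStatement' rows are packed into one INSERT to cut round trips.
    public static List<String> buildInserts(List<String> lines, int rowsPerStatement) {
        List<String> statements = new ArrayList<>();
        StringBuilder sb = null;
        int rowsInCurrent = 0;
        for (String line : lines) {
            String[] f = line.split(";");
            String id = f[0].replace("'", "''");   // naive quote escaping for the sketch
            String name = f[1].replace("'", "''");
            if (rowsInCurrent == 0) {
                sb = new StringBuilder("INSERT INTO EMP_STAGE (EMP_ID, EMP_NAME) VALUES ");
            } else {
                sb.append(", ");
            }
            sb.append("('").append(id).append("', '").append(name).append("')");
            rowsInCurrent++;
            if (rowsInCurrent == rowsPerStatement) {
                statements.add(sb.toString());
                rowsInCurrent = 0;
            }
        }
        if (rowsInCurrent > 0) {
            statements.add(sb.toString());   // flush the last partial batch
        }
        return statements;
    }
}
```

With, say, 1,000 rows per statement, 55,000 records become 55 statements instead of 55,000 single inserts.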

Note that a hardware upgrade is very difficult (you need a lot of approvals, depending on your client's process) and very costly. In my experience, a hardware upgrade takes 2-3 months.

Regards,

Raghu_Vamsee

rajasekhar_reddy14
Active Contributor
0 Kudos

Hi,

I feel the design of the interface is not correct. It is better to handle the appending at the ABAP proxy level itself, or to append 1,000 records into a single message in the ABAP proxy, so that the burden on the Adapter Engine is reduced.

Internally, every adapter can process 5 messages at a time; the thread allocation of most of the standard adapters is 5.

One more thing: the Adapter Engine performance is definitely not good in your case, so it is better to take help from the Basis team to tune it.

Regards,

Raj

Shabarish_Nair
Active Contributor
0 Kudos

another factor could be the server sizing.

Is the QA server sized enough to handle such a load? If not, the testing is not realistic.

Also have you tried using message packaging?

Former Member
0 Kudos

Hi Vijay,

We are planning to implement message packaging for this. But after doing this, will the processing time come down from 37 hours to 2 hours?

Thanks,

Soumya.

Shabarish_Nair
Active Contributor
0 Kudos

An ideal design, if you ask me, would be:

1. The ABAP proxy should not send out all messages at one time. Use a loop and send a set of messages, say 5,000 per trigger.

2. Use message packaging - this should ideally give you better performance.

3. If sequencing is important, then trigger the ABAP proxy in EOIO mode - /people/arulraja.ma/blog/2006/08/18/xi-reliable-messaging-150-eoio-in-abap-proxies
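The loop in step 1 would of course live in the ABAP sender report; purely to illustrate the chunking idea, here is a sketch (in Java) of splitting the record set into packets, where each packet would become one proxy call and hence one PI message:

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of step 1 above: fire one message per packet of records
// instead of one message per record. The real loop would be ABAP code
// in the sender proxy report.
public class ChunkedSender {

    // Split 'records' into packets of at most 'packetSize' entries.
    public static <T> List<List<T>> toPackets(List<T> records, int packetSize) {
        List<List<T>> packets = new ArrayList<>();
        for (int i = 0; i < records.size(); i += packetSize) {
            packets.add(new ArrayList<>(
                records.subList(i, Math.min(i + packetSize, records.size()))));
        }
        return packets;
    }
}
```

With 55,000 records and a packet size of 5,000, this produces 11 messages instead of 55,000, which is what brings the Adapter Engine time down.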

Former Member
0 Kudos

Hi,

Thanks a lot for all your replies.

Right now we are using the SFTP adapter on the receiver side. If we instead connect to a database on the target side and try to post the data into one DB table, is there any performance improvement with a JDBC channel compared to SFTP? Please suggest.

Thanks,

Soumya.

rajasekhar_reddy14
Active Contributor
0 Kudos

Hi Soumya,

It completely depends on how you are going to insert the data into the table: in one transaction or in multiple. That means that to insert 55k records, if you are going to perform 55k insert statements, it will definitely take time, but not more than one day.

The JDBC maximum concurrency is 5. Using JDBC is the right option, but once again it depends on your server.

Regards,

Raj

Former Member
0 Kudos

Hi Raj,

Thanks for the quick reply.

We are getting 55k messages in PI, so we would need to perform 55k insert operations on the DB table.

Thanks,

Soumya.

rajasekhar_reddy14
Active Contributor
0 Kudos

Okay. Recently I inserted 3 lakh records into a database in 8 hours, but I am not sure about your case; try it once.

Former Member
0 Kudos

hi,

In case you are using a database on your receiver side, go with a stored procedure (SP) and pass your entire input data as XML input to the SP. In this case the SP will be called once, and the entire proxy data will be inserted into the respective tables. Check with your DB team about the creation of an SP that reads XML data/text as input and does the further parsing.

Note: this will surely result in better performance.
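The caller side of this idea could look roughly like the sketch below. The procedure name (INSERT_EMPLOYEES) and the XML element names are placeholders, and the XML-parsing SP itself would be written by the DB team; this only shows bundling all records into one payload and making a single JDBC call:

```java
import java.util.List;

// Sketch: one stored-procedure call for the whole load instead of
// 55,000 single-row inserts. Procedure and element names are invented.
public class SpXmlCaller {

    // Build a single XML payload holding every record (id, name).
    // Records are assumed to contain no XML special characters in this sketch.
    public static String buildPayload(List<String[]> records) {
        StringBuilder xml = new StringBuilder("<Employees>");
        for (String[] r : records) {
            xml.append("<Employee><Id>").append(r[0]).append("</Id>")
               .append("<Name>").append(r[1]).append("</Name></Employee>");
        }
        return xml.append("</Employees>").toString();
    }

    // Invoke the SP once, passing the whole payload as one parameter.
    public static void callProcedure(java.sql.Connection con, String payload)
            throws java.sql.SQLException {
        try (java.sql.CallableStatement cs = con.prepareCall("{call INSERT_EMPLOYEES(?)}")) {
            cs.setString(1, payload);
            cs.execute();
        }
    }
}
```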

Thanks

Amit

Former Member
0 Kudos

Hi Raj,

Did you perform 3 lakh insert operations on your database table? Can you please explain your scenario?

former_member208856
Active Contributor
0 Kudos

You have 55K records, and you have to post them in a single execution.

Open the table and post all 55K records in one execution.

Do not open the table, post one record, close the table, and reopen it.

Post all records in a single statement.

Do it in one connection; do not use single-record posting.