on 06-03-2011 8:48 AM
Hi All,
We are getting data from an SAP HR system and need to post it to a partner system, so we configured a Proxy (SAP) to File (partner) scenario. The data of each message must be appended to the target file. Since this is a very critical interface, we have used dedicated queues. The scenario works fine in D. When the interface was transported to Q, it was tested with a full load, i.e. 55,000 messages. All messages were processed successfully in the Integration Engine, but processing in the Adapter Engine took nearly 37 hours. We need to post all 55,000 records within 2 hours.
The design of this interface is simple. We use a direct mapping and the size of each message is 1 KB, but all messages must be appended to one file on the target side. We are using the Advantco SFTP adapter as the receiver and a proxy as the sender.
Could you please suggest a solution to process all 55,000 messages within 2 hours?
Thanks,
Soumya.
How are you sending the data from the proxy side?
It should be sent as a single packet (50,000 records); if it is sent one record at a time, it will give you problems.
Receive a single packet from the proxy and pass the same to the file.
Hi Sandeep,
Thanks for the reply. From SAP, they will trigger 55,000 messages, and each message has only one record (&lt;1 KB). One approach is to collect a number of messages from the proxy in PI and send them to the target side, but for this we would need to implement BPM. Is there any other approach to achieve this without using BPM?
Thanks,
Soumya.
Hi Soumya,
I understand your scenario as: HR data has to be sent to a third-party system once a day. I guess they are synchronizing employee (55,000) data in the third-party system with SAP HR data daily.
I would design this scenario as follows:-
I would ask an ABAPer to write an ABAP program which runs at 12:00, picks up the 55,000 records from the SAP HR tables, and places them in one file. That file will be placed in the SAP HR file system (you can see it using AL11). At 12:30, a PI file channel picks up the file and transfers it to the third-party target system as it is, without any transformation: a File-to-File pass-through scenario (no ESR objects). Now ask the target system to take the file and run their program (they should have some SQL routines); that SQL program will insert these records into the target system tables.
If 55,000 records make a huge file on the SAP HR system, ask the ABAPer to split it into parts. PI will pick them up in sequence based on the file name.
In this approach, I would ask both the SAP HR (sender) and third-party (target) system people to be flexible. Otherwise, I would say it is not technically possible with the current PI resources. In my opinion, PI is middleware, not a system in which huge computations should be done. If messages are coming from different systems, then collecting them in the middleware makes sense. In your case, collecting a large number of messages from a single system at high frequency is not advisable.
If the third-party target system people are not flexible, then go for a File-to-JDBC scenario. Ask the SAP HR ABAPer to split the input file into a larger number of files (10-15; your PI system should be able to handle that). At the receiver JDBC channel, use native SQL. You need a Java mapping to construct the SQL statements in PI. Don't convert the flat file to the JDBC XML structure; in your case PI cannot handle such a huge XML payload.
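The Java mapping itself is PI-specific, but the string-building it would do can be sketched in a language-neutral way. This is a minimal illustration of generating one native multi-row INSERT from flat-file records instead of the JDBC XML structure; the table and column names (EMP_DATA, EMP_ID, EMP_NAME) and the semicolon-delimited record layout are assumptions for the example.

```python
# Sketch of what the mapping would emit: a single native multi-row INSERT
# built from flat-file records, bypassing the JDBC XML structure.
# Table/column names and the "id;name" record layout are assumptions.

def build_insert(lines):
    """Build one multi-row INSERT from lines shaped like 'emp_id;emp_name'."""
    values = []
    for line in lines:
        emp_id, emp_name = line.split(";")
        # Escape single quotes so the generated SQL stays valid.
        values.append("('%s', '%s')" % (emp_id.replace("'", "''"),
                                        emp_name.replace("'", "''")))
    return ("INSERT INTO EMP_DATA (EMP_ID, EMP_NAME) VALUES "
            + ", ".join(values))
```

One statement per file chunk means the database sees 10-15 round trips instead of 55,000.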
Note that a hardware upgrade is very difficult (you need a lot of approvals, depending on your client's processes) and very costly. In my experience a hardware upgrade takes 2-3 months.
Regards,
Raghu_Vamsee
Hi,
I feel the design of the interface is not correct. It is better to handle the appending at the ABAP proxy level itself, or to append 1,000 records into a single message in the ABAP proxy, so that the burden on the Adapter Engine is reduced.
Internally, every adapter can process 5 messages at a time; the thread allocation of most of the standard adapters is 5.
One more thing: the AE performance is definitely not good in your case, so it is better to take help from the Basis team to tune the performance of the AE.
Regards,
Raj
Another factor could be the server sizing.
Is the QA server sized to handle such a load? If not, the testing is not realistic.
Also have you tried using message packaging?
An ideal design, if you ask me, would be:
1. The ABAP proxy should not send out all messages at one time. Use a loop and send a set of messages, say 5,000 per trigger.
2. Use message packaging - this should ideally give you better performance.
3. If sequencing is important, then trigger the ABAP proxy in EOIO mode - /people/arulraja.ma/blog/2006/08/18/xi-reliable-messaging-150-eoio-in-abap-proxies
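Step 1 above is ultimately an ABAP proxy loop, but the chunking logic is simple enough to sketch in a few lines. This assumes 55,000 records and a packet size of 5,000 purely for illustration:

```python
# Language-neutral sketch of step 1: the proxy report loops over the
# record set and sends one message per packet, not one per record.

def split_into_packets(records, packet_size=5000):
    """Split a record list into fixed-size packets (last one may be short)."""
    return [records[i:i + packet_size]
            for i in range(0, len(records), packet_size)]
```

With 55,000 single-record messages the Adapter Engine queues 55,000 units of work; with 5,000-record packets it queues 11, which is where the bulk of the improvement comes from.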
Hi,
Thanks a lot for all your replies.
Right now we are using the SFTP adapter on the receiver side. If we instead connect to the database on the target side and post the data into one DB table, is there any performance improvement with a JDBC channel over SFTP? Please suggest.
Thanks,
Soumya.
Hi Soumya,
It completely depends on how you are going to insert the data into the table: in one transaction or multiple. That means if, to insert the 55K records, you are going to perform 55K insert statements, it will definitely take time, but not more than one day.
The JDBC maximum concurrency is 5. Using JDBC is a right option, but once again it depends on your server.
Regards,
Raj
hi,
In case you are using a database on your receiver side, then go with a stored procedure (SP) and pass your entire input data as XML input to the SP. In this case your SP will be called once and the entire proxy data will be inserted into the respective tables. Check with your DB team about creating an SP which reads XML data/text as input and does the further parsing.
Note: this will surely result in better performance.
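The payload-building half of this idea can be sketched briefly: all records are folded into one XML document so the stored procedure is invoked exactly once. The element names (rows/row/id/name), the procedure name, and the "id;name" record layout are assumptions; the real contract would be defined by the DB team.

```python
# Sketch of the "XML input to the SP" idea: one XML document carrying all
# records, handed to the stored procedure in a single call.
# Element names and the 'id;name' record layout are assumptions.

def records_to_xml(records):
    """Fold records shaped like 'id;name' into a single <rows> document."""
    rows = []
    for rec in records:
        rec_id, name = rec.split(";")
        rows.append("<row><id>%s</id><name>%s</name></row>" % (rec_id, name))
    return "<rows>%s</rows>" % "".join(rows)
```

The SP then parses the document server-side and inserts all rows in one transaction, so the call count drops from 55,000 to 1.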
Thanks
Amit
You have 50K records, and you have to post them in a single execution.
Open the table and post all 50K records in one execution.
Do not open the table, post one record, close the table, and reopen it.
Post all messages in a single statement.
Do it in one connection.
Do not use single-record posting.
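The advice above can be demonstrated concretely. This is a minimal sketch using SQLite as a stand-in for the real target database: one connection, one reusable prepared statement driven by `executemany`, and one commit, instead of an open-insert-close cycle per record. The table and column names are assumptions.

```python
import sqlite3

# Sketch of "one connection, one transaction": all records go through a
# single prepared statement and are committed once. SQLite stands in for
# the real target DB; table/column names are assumptions.

def insert_all(records):
    """Bulk-insert (emp_id, emp_name) tuples in one transaction; return row count."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE emp_data (emp_id TEXT, emp_name TEXT)")
    with conn:  # one transaction, committed once on success
        conn.executemany(
            "INSERT INTO emp_data (emp_id, emp_name) VALUES (?, ?)", records)
    count = conn.execute("SELECT COUNT(*) FROM emp_data").fetchone()[0]
    conn.close()
    return count
```

Per-record commits are what turn a bulk load into a multi-hour job; batching them into one transaction removes that overhead regardless of the specific database.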