Recorded for Outbound Processing

Former Member
0 Kudos

Hi All,

Good morning.

My message has a large payload. When I test my scenario with a small amount of data, the message is successfully stored in the target system database using the stored procedure.

But in my real scenario the payload is large (for example, 170 MB) and the messages are not processed at the target end. They stay in the queue, and the status in SXMB_MONI shows an arrow mark with "Recorded for Outbound Processing".

The queue name shown in SMQ2 is "XBTM0000".

Can you give me your valuable suggestions on how to solve this issue?

Thank you

Sateesh

Accepted Solutions (1)


Former Member
0 Kudos

Hi Sateesh,

Can you please register the queues using SXMB_ADM?

Go to SXMB_ADM -> Manage Queues -> Register Queues -> Activate Queues.

Please remove/unlock the stuck queue entries before doing these steps.

Once you have done all these steps, the messages should no longer be stuck in the queues.

Thanks

Ravi

Answers (3)


stefan_grube
Active Contributor
0 Kudos

> Can you give me your valuable suggestions on how to solve this issue?

Use sizing guide and set up your system accordingly.

Or use smaller files.

rajasekhar_reddy14
Active Contributor
0 Kudos

Hi,

Transferring a 170 MB file causes performance issues; that could be the reason.

Regards,

Raj

former_member187339
Active Contributor
0 Kudos

Hi Sateesh,

What is the status of the LUW in the queue? Try to activate and unlock the queue.

Regards

Suraj

Former Member
0 Kudos

Hi Suraj,

Thanks for your reply.

My LUW shows User: WF-BATCH, Function module: SXMS_ASYNC_EXEC, Queue name: XBTM0000, Status: Time limit exceeded.

I tested this message yesterday and it showed "Time limit exceeded".

Thanks.

Sateesh

former_member187339
Active Contributor
0 Kudos

Hi Sateesh,

Check this blog /people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts

Try to increase the timeout for JDBC adapter in Visual admin or NWA.

Regards

Suraj

Former Member
0 Kudos

Hi Suraj,

Just to confirm what the status was earlier:

These queue messages don't call the JDBC receiver adapter at all; there is no error and no polling in the JDBC adapter.

I also tested my scenario last week. At that time the queue messages showed "Transaction Executing" for five days.

I deleted those queue messages yesterday and tested a new polling run; after some time the message showed "Time limit exceeded". But the earlier queue messages did not show the "Time limit exceeded" error.

Anyway, I will try increasing the JDBC adapter timeout according to Michal's blog. But by how much should I increase it?

And what is the default timeout for the JDBC adapter?

Thank you so much.

Sateesh

vijay_kumar133
Active Participant
0 Kudos

Hi Sateesh,

If the input data records are huge, we can split the records by setting parameters in the Advanced Mode section of the sender JDBC adapter.

The parameters are:

msgLimit (boolean)

maxMsgSize (int)

maxRowSize (int)

You can also calculate the maximum number of rows per message with:

Max Rows = Maximum Message Size / (2 * Maximum Row Size)

Check this note for more clarification on the above calculation:

SAP Note 1253826 - Configuring Maximum Message Size Limits
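As a quick sanity check of that formula, here is a minimal sketch; the 500 KB message limit and 2 KB row size are made-up example values, not figures from this thread or from the note:

```python
def max_rows(max_msg_size_bytes: int, max_row_size_bytes: int) -> int:
    """Max Rows = Maximum Message Size / (2 * Maximum Row Size),
    per the formula quoted above (see SAP Note 1253826)."""
    return max_msg_size_bytes // (2 * max_row_size_bytes)

# Example: a 500 KB message limit with rows of at most 2 KB each
# allows 512000 // 4096 = 125 rows per message.
print(max_rows(500 * 1024, 2 * 1024))  # → 125
```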

Regards

Vijay Kumar G

Former Member
0 Kudos

Hi Vijay ,

Thanks for your input.

My scenario is JDBC --> XI <--> JDBC.

The JDBC receiver is synchronous: I select records from the target JDBC, and as a response the message is mapped to the source response structure and the data is stored in the source JDBC.

I am using one JDBC sender and two JDBC receiver adapters. My scenario is successful when XI selects a small amount of data for testing purposes; in our productive scenario we only have a large amount of data.

I am using the JDBC receiver adapter to call a stored procedure and insert the records; if the records are inserted successfully, a log table entry is created. That is why I am using the stored procedure.

Is it possible to configure the JDBC receiver adapter for a large amount of data?

Thanks

Sateesh

former_member187339
Active Contributor
0 Kudos

Hi Sateesh,

Are you using BPM?

>> Is it possible to configure the JDBC receiver adapter for a large amount of data?

One of the disadvantages of the JDBC adapter is that it tends to break easily with large data sizes, but here your message has not even reached the JDBC adapter yet.

I suspect there is a problem with the design of your interface.

Regards

Suraj

Former Member
0 Kudos

Hi Suraj,

In this scenario I am using BPM.

My BPM steps are Receive1 - Transformation1 - Send1 (synchronous) - Transformation2 - Send2. The BPM graphical workflow log shows it completed up to the Send2 step; there is no error in the BPM.

I tested the same scenario with a small amount of data (in my test system), more than 2,000 records, and it was successfully stored at the target end. But the productive data has more than 1.5 lakh records, and when I use the productive data I get this problem.

My BPM completes successfully up to the final step even with the productive data, but the message is stuck in the queue.

For example, my message flow is:

1) Sender service to BPM (receive step) (request message)

after the transformation step

2) BPM to receiver service (send synchronously)

3) Receiver service to BPM

after the transformation step

4) BPM to sender service (send step)

Up to step 4 my messages are displayed in SXMB_MONI; step 4 shows the arrow mark with "Recorded for Outbound Processing".

Thanks,

Sateesh

stefan_grube
Active Contributor
0 Kudos

> I am using the JDBC receiver adapter to call a stored procedure and insert the records; if the records are inserted successfully, a log table entry is created. That is why I am using the stored procedure.

Use a stored procedure in sender channel to restrict the volume of data.

It does not make any sense to send all database entries in one large message.

Would you write an ABAP program which reads the whole database at once and holds it in an internal table?
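The batching idea above can be illustrated with a small sketch: read the source table in fixed-size chunks instead of one huge result set, which is roughly what a volume-restricting stored procedure in the sender channel achieves. The table, column names, and batch size are invented for illustration, and a real XI sender channel would of course do this in the database, not in Python:

```python
import sqlite3

BATCH_SIZE = 1000  # invented example value, not from the thread

def process_in_batches(conn, batch_size=BATCH_SIZE):
    """Read the emp table in fixed-size chunks; each chunk would
    correspond to one small XI message instead of one 170 MB payload."""
    cur = conn.cursor()
    cur.execute("SELECT empno, empname, address FROM emp")
    batches = 0
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        batches += 1  # hand this chunk off as its own message
    return batches

# Demo with an in-memory database holding 2,500 example rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, empname TEXT, address TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [(i, f"name{i}", f"addr{i}") for i in range(2500)])
print(process_in_batches(conn))  # → 3 batches of up to 1,000 rows
```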

vijay_kumar133
Active Participant
0 Kudos

Hi Sateesh,

Can you clarify your scenario a bit more?

The JDBC sender picks the data, and the same data is updated at the receiver JDBC. After that, you collect the standard response from the receiver side and put it into another database; is that right?

I have a doubt: after updating one receiver, what kind of data are you getting back to XI to update into the other database?

Regards

Vijay

Former Member
0 Kudos

Hi Vijay,

My scenario steps

JDBC (SQL Server) --> XI <--> JDBC (Oracle)

1) XI picks the "Date" field from the source JDBC (SQL Server).

2) Then XI goes to the target JDBC (Oracle) to select records based on the date field condition.

My condition looks like, for example: SELECT empno, empname, address FROM ... WHERE date > Date;

3) Now XI picks the response from the target JDBC (Oracle), and this response data is inserted into the source JDBC (SQL Server).

Thanks,

Sateesh.

rajasekhar_reddy14
Active Contributor
0 Kudos

Hi Sateesh,

As I already said, the 1.5 lakh records (170 MB) file is causing the problem. It is not advisable to send huge files; try to change the design as per Stefan's input.

Regards,

Raj