
Dispatcher fails to come up - dies with message "java/lang/OutOfMemoryError"

rommel_bhan
Explorer
0 Kudos

We are running into a strange issue. After a successful migration of our XI development environment from Solaris to AIX, and after a month of normal operation, we are facing an error starting the Java dispatcher. The dispatcher comes up but dies very soon afterwards.

We have a message open with SAP with all the heap and Java core files, but I want to see if anyone has seen this issue or has a clue why it is happening. We have bumped the memory up to 11 GB at the server level, but it looks like something was running before this happened; Java now seems to be trying to finish that task and keeps consuming memory. A restart of the server has not helped either.

Is there a way to find out what the J2EE Engine is trying to finish up?

The messages in std_server0.out are the following:

JVMDUMP006I Processing Dump Event "systhrow", detail "java/lang/OutOfMemoryError" - Please Wait.

JVMDUMP007I JVM Requesting Snap Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/Snap0001.20080314.193310.1106138.trc'

JVMDUMP012E Error in Snap Dump: {nothing to snap}

JVMDUMP007I JVM Requesting Heap Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd'

JVMDUMP010I Heap Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd

JVMDUMP007I JVM Requesting Java Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/javacore.20080314.193310.1106138.txt'

JVMDUMP010I Java Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/javacore.20080314.193310.1106138.txt

JVMDUMP013I Processed Dump Event "systhrow", detail "java/lang/OutOfMemoryError".

FATAL: Caught OutOfMemoryError! Node will exit with exit code 666: java.lang.OutOfMemoryError

Accepted Solutions (1)

Former Member
0 Kudos

Hello Rommel,

As I mentioned in my response, it is not SAP Memory Analyzer but IBM Memory Analyzer that you need :-).

You can download it from :

http://www.alphaworks.ibm.com/tech/heapanalyzer

Hope this helps.

Regards,

Snehal

Answers (2)

Former Member
0 Kudos

Make sure you have the settings configured as per SAP Note 723909.

The best (and only) way to analyze what is consuming so much memory is to use the heap dump.

Please download IBM HeapAnalyzer from the IBM site and install it.

Open the heap dump with it and analyze it as per the tutorial. You may find a hint there.
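As a hint, launching the analyzer from the command line usually looks like the sketch below. Note that the jar file name (ha456.jar here) and the -Xmx value are assumptions on my part - take the real file name from your download, and give the tool more heap than the size of the .phd file, or the analyzer itself can run out of memory:

```shell
# Hypothetical invocation of IBM HeapAnalyzer against the .phd file
# written by the JVM (path taken from the std_server0.out log above).
# Jar name and heap size are assumptions - adjust to your download/machine.
java -Xmx2g -jar ha456.jar \
  /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd
```

The analyzer then shows a tree of objects by retained size, which usually points straight at whatever is holding on to the memory.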

Hope this helps.

Regards,

Snehal

Edited by: Snehal Bhaidasna on Mar 17, 2008 11:53 AM

rommel_bhan
Explorer
0 Kudos

Thanks. However, the Memory Analyzer page shows that it doesn't support the IBM platform yet:

What heap dumps are supported?

SAP Memory Analyzer supports HPROF binary heap dumps, a de-facto standard of Sun supported also by other vendors:

Sun, SAP and HP JDK/JVM from version 1.4.2_12 and 5.0_7 and 6.0 upwards

IBM doesn't support HPROF binary heap dumps. Therefore IBM heap dumps can't be analyzed with the SAP Memory Analyzer. We know of no tool to convert an IBM heap dump into an HPROF binary heap dump, but surely such a converter (System Dump/DTFJ API -> HPROF binary heap dump) could be written.

Still, this is good to know. Thanks for your answers.

nisarkhan_n
Active Contributor
0 Kudos

Please paste the complete developer trace from the console.

rommel_bhan
Explorer
0 Kudos

Found a good OSS note which seems to have resolved this issue:

Note 994433 - XI AF on J2EE Engine terminates with OutOfMemoryError

It seems the table SAPSR3DB.XI_AF_MSG at the database level keeps the status of work still to be done, and that status flag has to be updated as per the note to let the dispatcher come up.
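For illustration of what the note is looking at, a read-only check of that table could be run from the database host roughly as sketched below. The column name MSG_STATUS and the idea of grouping by status are assumptions of mine, not taken from the note - use the exact statements from Note 994433 for any actual change:

```shell
# Hypothetical sketch: count XI AF messages per status to see how much
# in-flight work the Adapter Framework thinks it still has to process.
# Column names and any corrective UPDATE must come from SAP Note 994433.
sqlplus -s sapsr3db/<password> <<'EOF'
SELECT MSG_STATUS, COUNT(*)
FROM   SAPSR3DB.XI_AF_MSG
GROUP  BY MSG_STATUS;
EOF
```

A large backlog of messages stuck in a "to be delivered" style status would match the behaviour described above, where the engine eats memory trying to finish old work on every restart.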