on 03-14-2008 8:19 PM
**We are running into a strange issue. After a successful migration of our XI development environment from Solaris to AIX, and a month of stable operation, we are facing an error starting the Java dispatcher. The dispatcher comes up but dies very soon afterwards.
We have a message open with SAP, including all the heap and Java core files, but I want to see if anyone has seen this issue or has a clue why it is happening. We have increased the memory to 11 GB at the server level, but it looks like something was running before the crash; Java now seems to be trying to finish that task and keeps consuming memory until it dies. A restart of the server has not helped either.
Is there a way to find out what the J2EE engine is trying to finish?
The messages in std_server0.out are the following:**
JVMDUMP006I Processing Dump Event "systhrow", detail "java/lang/OutOfMemoryError" - Please Wait.
JVMDUMP007I JVM Requesting Snap Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/Snap0001.20080314.193310.1106138.trc'
JVMDUMP012E Error in Snap Dump: {nothing to snap}
JVMDUMP007I JVM Requesting Heap Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd'
JVMDUMP010I Heap Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd
JVMDUMP007I JVM Requesting Java Dump using '/usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/javacore.20080314.193310.1106138.txt'
JVMDUMP010I Java Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/javacore.20080314.193310.1106138.txt
JVMDUMP013I Processed Dump Event "systhrow", detail "java/lang/OutOfMemoryError".
FATAL: Caught OutOfMemoryError! Node will exit with exit code 666
java.lang.OutOfMemoryError
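When the engine keeps crashing, the first useful step is to gather the dump files the IBM JVM reported. As a minimal sketch (the `JVMDUMP010I` message format is taken from the log excerpt above; the helper name and log file handling are my own assumptions), one could scan std_server0.out for the written dumps like this:

```python
import re

# Matches IBM JVM messages such as:
#   JVMDUMP010I Heap Dump written to /usr/sap/.../heapdump....phd
# The regex is based on the log lines quoted in this thread.
DUMP_WRITTEN = re.compile(r"JVMDUMP010I\s+(\w+ Dump) written to (\S+)")

def collect_dumps(log_lines):
    """Return (dump_type, path) tuples for every dump the JVM reported writing."""
    return [m.groups() for line in log_lines if (m := DUMP_WRITTEN.search(line))]

# Sample lines taken from the std_server0.out excerpt above.
sample = [
    "JVMDUMP010I Heap Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/heapdump.20080314.193310.1106138.phd",
    "JVMDUMP010I Java Dump written to /usr/sap/XID/DVEBMGS00/j2ee/cluster/server0/javacore.20080314.193310.1106138.txt",
]
for dump_type, path in collect_dumps(sample):
    print(dump_type, "->", path)
```

The `.phd` heap dump is what you would then open in a heap analysis tool, and the javacore text file shows which threads were active at the time of the OutOfMemoryError.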
Hello Rommel,
As I mentioned in my response, it is not SAP Memory Analyzer but IBM HeapAnalyzer that you need :-).
You can download it from :
http://www.alphaworks.ibm.com/tech/heapanalyzer
Hope this helps.
Regards,
Snehal
Make sure you have the JVM settings configured as per SAP Note 723909.
The best, and really the only, way to analyze what is consuming so much memory is the heap dump.
Please download IBM HeapAnalyzer from the IBM site and install it.
Open the heap dump with it and analyze it as per the tutorial. You may find some hints there.
Hope this helps.
Regards,
Snehal
Edited by: Snehal Bhaidasna on Mar 17, 2008 11:53 AM
Thanks; however, the Memory Analyzer page shows that it doesn't support IBM heap dumps yet:
What heap dumps are supported?
SAP Memory Analyzer supports HPROF binary heap dumps, a de-facto standard introduced by Sun and supported by other vendors as well:
Sun, SAP and HP JDK/JVM from versions 1.4.2_12, 5.0_7 and 6.0 upwards
IBM doesn't support HPROF binary heap dumps. Therefore IBM heap dumps can't be analyzed with the SAP Memory Analyzer. We know of no tool to convert an IBM heap dump into an HPROF binary heap dump, but surely such a converter (System Dump/DTFJ API -> HPROF binary heap dump) could be written.
Still this is good to know and thanks for your answers.
Paste the complete developer trace from the console.
Found a good OSS note which seems to have resolved this issue:
Note 994433 - XI AF on J2EE Engine terminates with OutOfMemoryError
It seems the table SAPSR3DB.XI_AF_MSG at the database level keeps the status of work still to be done, and that status flag has to be updated as described in the note to let the dispatcher come up.
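To make the mechanism concrete, here is an illustrative sketch using an in-memory SQLite table. The real table is SAPSR3DB.XI_AF_MSG in the XI database; the column name `msg_status` and the status values used here are assumptions for illustration only, so always apply the exact statement given in Note 994433 rather than this mock:

```python
import sqlite3

# Mock stand-in for SAPSR3DB.XI_AF_MSG; schema and status values are assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE xi_af_msg (msg_id TEXT, msg_status TEXT)")
conn.executemany(
    "INSERT INTO xi_af_msg VALUES (?, ?)",
    [("m1", "TO_BE_DELIVERED"), ("m2", "TO_BE_DELIVERED"), ("m3", "DELIVERED")],
)

# Messages still flagged as pending are what the Adapter Framework tries to
# reprocess at startup -- with enough of them, the heap is exhausted before
# the dispatcher finishes coming up.
pending = conn.execute(
    "SELECT COUNT(*) FROM xi_af_msg WHERE msg_status = 'TO_BE_DELIVERED'"
).fetchone()[0]
print("pending messages:", pending)
```

The fix in the note amounts to updating that status flag so the engine no longer tries to reprocess the backlog at startup.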