
Heap dump file size vs heap size

Former Member

Hi,

I'd like to clarify a few doubts I have.

At the moment we're analyzing Sun JVM heap dumps from a Solaris platform.

The observation is that the heap dump file is around 1.1 GB, while after loading it into SAP Memory Analyzer the statistics show "Heap: 193,656,968", which as I understand is the size of the heap.

After I run:

jmap -heap <PID>

I get the following information:

using thread-local object allocation
Parallel GC with 8 thread(s)

Heap Configuration:
   MinHeapFreeRatio = 40
   MaxHeapFreeRatio = 70
   MaxHeapSize      = 3221225472 (3072.0MB)
   NewSize          = 2228224 (2.125MB)
   MaxNewSize       = 4294901760 (4095.9375MB)
   OldSize          = 1441792 (1.375MB)
   NewRatio         = 2
   SurvivorRatio    = 32
   PermSize         = 16777216 (16.0MB)
   MaxPermSize      = 67108864 (64.0MB)

Heap Usage:
PS Young Generation
Eden Space:
   capacity = 288620544 (275.25MB)
   used     = 26593352 (25.36139678955078MB)
   free     = 262027192 (249.88860321044922MB)
   9.213949787302736% used
From Space:
   capacity = 2555904 (2.4375MB)
   used     = 467176 (0.44553375244140625MB)
   free     = 2088728 (1.9919662475585938MB)
   18.27830779246795% used
To Space:
   capacity = 2490368 (2.375MB)
   used     = 0 (0.0MB)
   free     = 2490368 (2.375MB)
   0.0% used
PS Old Generation
   capacity = 1568669696 (1496.0MB)
   used     = 1101274224 (1050.2569427490234MB)
   free     = 467395472 (445.74305725097656MB)
   70.20434109284916% used
PS Perm Generation
   capacity = 67108864 (64.0MB)
   used     = 40103200 (38.245391845703125MB)
   free     = 27005664 (25.754608154296875MB)
   59.75842475891113% used

So I'm just wondering what this "Heap" value in the Statistic Information field in SAP Memory Analyzer actually is.

When I go to the Dominator Tree view and look at the Retained Heap column, I see that the values roughly sum up to 193,656,968.

Could someone shed some more light on this?

thanks

Michal

Accepted Solutions (0)

Answers (1)


former_member197208

Hi Michal,

That indeed looks very odd. First, let me ask: which version do you use? We had a problem in the past where classes loaded by the system class loader were not marked as garbage collection roots and hence were removed. This problem is fixed in the current version (1.1). If it is version 1.1, then I would love to have a look at the heap dump and find out whether the problem is on our side.

Having said that, this is what we do: after parsing the heap dump, we remove objects which are not reachable from garbage collection roots. This is necessary because the heap dump can contain garbage. For example, the mark-sweep-compact of the old/perm generation leaves some dead space in the form of int arrays or java.lang.Object instances to save time during the compacting phase: by leaving behind dead objects, not every live object has to be moved, which means not every object needs a new address. This is the kind of garbage we remove.
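Conceptually, this removal is just a reachability pass over the object graph starting from the garbage collection roots. Below is a minimal sketch of the idea in Java; the HeapObject type and its fields are made up for illustration only and are not Memory Analyzer's actual internals.

import java.util.*;

// Hypothetical, simplified model of one object in a parsed heap dump.
class HeapObject {
    long address;
    List<HeapObject> references = new ArrayList<HeapObject>();
    HeapObject(long address) { this.address = address; }
}

class ReachabilitySketch {
    // Returns only the objects reachable from the GC roots; everything else
    // (for example the dead filler objects left behind by the
    // mark-sweep-compact collector) would be dropped before the analysis.
    static Set<HeapObject> reachableFrom(Collection<HeapObject> gcRoots) {
        Set<HeapObject> live = new HashSet<HeapObject>();
        LinkedList<HeapObject> pending = new LinkedList<HeapObject>(gcRoots);
        while (!pending.isEmpty()) {
            HeapObject current = pending.removeFirst();
            if (live.add(current)) {            // visit each object only once
                pending.addAll(current.references);
            }
        }
        return live;
    }
}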

Of course, we do not remove objects kept alive only by weak or soft references. To see what memory is kept alive only through weak or soft references, one can run the "Soft Reference Statistics" from the menu.
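To make the soft reference case concrete, here is a small, self-contained Java example (the 10 MB array size is arbitrary): the byte array below is reachable only through a SoftReference, so it still appears in a heap dump taken at this point, but the "Soft Reference Statistics" query would show it as memory the JVM is free to reclaim under pressure.

import java.lang.ref.SoftReference;

public class SoftlyHeldExample {
    public static void main(String[] args) {
        // 10 MB kept alive only through a soft reference: still reachable
        // from a GC root (the local variable holding the SoftReference),
        // but the JVM may clear it when memory gets tight.
        SoftReference<byte[]> cache =
                new SoftReference<byte[]>(new byte[10 * 1024 * 1024]);

        System.out.println("softly referenced array still present: "
                + (cache.get() != null));
    }
}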

Kind regards,

- Andreas.

Edited by: Andreas Buchen on Feb 14, 2008 6:23 PM

Former Member

Hi,

sorry for forgetting to mention the versions.

Memory Analyzer: Version 1.1.1 - The Diligent Digger

The application was running on Sun JRE 1.5.0_14.

The heap dump was obtained using jmap from Sun JDK 1.5.0_14.
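For reference, on Sun JDK 1.5 the binary heap dump is produced with a command along these lines (the resulting heap.bin file is what gets loaded into Memory Analyzer; later JDKs use jmap -dump:format=b,file=<file> <PID> instead):

jmap -heap:format=b <PID>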

Unfortunately, passing on the heap dump file does not seem to be feasible; I'll ask around in my company, though. Can I provide any other information that would be useful to you?

former_member197208

Hi Michal,

I'd like to provide you with a version with verbose debug messages (I'll add some stuff). Can you contact me at andreas dot buchen at sap dot com?

- Andreas.