
SAP HANA Memory Usage

0 Kudos

Hello,

I am using the following queries to check the memory consumption of our HANA server. But the total memory used shows up way higher than the sum of Code+Stack + Column Tables + Row Tables (4 + 13 + 3 = 20 GB against 104 GB). Am I missing anything else in this equation?

-- Total Memory Used (result 104 GB, the same as "SAP HANA DB memory used" on the overview tab)
SELECT      round(sum(TOTAL_MEMORY_USED_SIZE/1024/1024/1024)) AS "Total Used GB"
FROM      SYS.M_SERVICE_MEMORY;

-- Code and Stack Size (Result 4 GB)
SELECT      round(sum(CODE_SIZE+STACK_SIZE)/1024/1024/1024) AS "Code+stack GB"
FROM      SYS.M_SERVICE_MEMORY;

-- Memory Consumption of All Column Tables (Result 13 GB)
SELECT      round(sum(MEMORY_SIZE_IN_TOTAL)/1024/1024/1024) AS "Column Tables GB"
FROM      M_CS_TABLES;

-- Memory Consumption of All Row Tables (Result 3 GB)
SELECT      round(sum(USED_FIXED_PART_SIZE + USED_VARIABLE_PART_SIZE)/1024/1024/1024) AS "Row Tables GB"
FROM      M_RS_TABLES;

Thanks!

Accepted Solutions (1)

lbreddemann
Active Contributor
0 Kudos

Hi Vimal,

I'm a bit surprised that you assume HANA's memory usage consists of only the executable code, the stack, and the memory currently allocated to the row and column stores.

What makes you think that this is all there is?

Please check the system monitoring views for detailed insight into memory allocation, usage and assignment.
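
For example, a query along these lines against M_HEAP_MEMORY (one such view; consider it a sketch, with the column names as in the standard monitoring view reference) shows which allocators actually hold the memory:

-- top heap allocators by memory currently in use
SELECT TOP 20 HOST, CATEGORY,
       ROUND(EXCLUSIVE_SIZE_IN_USE/1024/1024/1024, 2) AS "Used GB"
FROM SYS.M_HEAP_MEMORY
ORDER BY EXCLUSIVE_SIZE_IN_USE DESC;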

- Lars

0 Kudos

Hi Lars,

Appreciate your response. I knew I was missing a lot in my equation that adds up to the "Memory Used" value on the overview tab. Also, I got some of the queries from the following doc:

http://www.saphana.com/docs/DOC-2299

I did look into the system monitoring tables mentioned above, and if I take them into consideration then my total goes above the memory used number (looks like double counting). All I am trying to do is figure out the breakdown of the total memory used number, because looking at the current numbers, total memory used is almost 4 times the memory used by column and row tables.

Thanks!

lbreddemann
Active Contributor
0 Kudos

What revision of HANA are you using?

I recall that one of the more recent release notes contained a fixed bug described as something like "memory overview values doubled" or so...

- Lars

former_member184768
Active Contributor
0 Kudos

Hi Vimal,

The document you are referring to is quite good, but in my experience it tends to confuse a little, just the way you got lost in the numbers.

The memory management / consumption topic is a bit tedious, and as Lars mentioned, the monitoring views will help you identify what has been utilized where.

I'd recommend a different approach: please refer to the administration tab and check the memory consumption there. Please don't try to reconcile the numbers from the SQL queries you mentioned, as they do not provide the complete picture.

Apart from the memory consumed by data and code, there is a huge chunk of memory "allocated" from the hardware which is used for temporary operations (like merging result sets when you fire a SQL query, sorting of result sets, etc.). In my opinion, it is very difficult to find out EXACTLY what is being consumed, as some of the allocations are handled by HANA itself; they show up in the monitoring views mentioned by Lars.
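
If you want to see that gap, a sketch along these lines (using the standard M_SERVICE_MEMORY columns) contrasts what is actually used with what has merely been allocated:

-- used vs. allocated heap per service; the difference is memory HANA
-- has taken from the OS but is not currently filling with data
SELECT SERVICE_NAME,
       ROUND(TOTAL_MEMORY_USED_SIZE/1024/1024/1024)     AS "Total Used GB",
       ROUND(HEAP_MEMORY_USED_SIZE/1024/1024/1024)      AS "Heap Used GB",
       ROUND(HEAP_MEMORY_ALLOCATED_SIZE/1024/1024/1024) AS "Heap Allocated GB"
FROM SYS.M_SERVICE_MEMORY;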

So, bottom line: for a simple view of things, please refer to the administration tab and check the memory consumption there.

Regards,

Ravi

0 Kudos

Hi Lars!

We are still on revision 35. I will look into the latest release documentation for more info.

Thanks

0 Kudos

Hi Ravi,

The reason I started looking into this is to understand and limit the data we are loading into HANA. If I look at the overview tab, it shows total memory used as 103 GB, but out of that, tables are using only about 1/5 (about 20 GB based on the SQL). Anyway, as Lars mentioned in his post about the bug, I'll look into the latest release document.

Thanks

Former Member
0 Kudos

Hi Vimal,

I'm just reading this thread and asking myself: why do you want to check the memory consumption?

What is your intention?

Do you want to reduce the memory consumption by identifying the "big part" of it?

Best Regards,

Marcel

0 Kudos

Hi Marcel,

Yes, that was the primary purpose, and that's why this question. Initially, I was thinking of (1) unloading some of the unwanted/unused tables from memory, and then (2) limiting the historical data that I am loading into some of the big, fat transaction tables in HANA (or maybe some kind of archival strategy).

But I don't think it's as simple as steps 1 and 2.

Regards,

Vimal Shah

former_member184768
Active Contributor
0 Kudos

Hi Vimal,

Please also check whether you have tables which are not currently in memory but do have data loaded into them. It is possible to have far more data loaded than the memory size, as long as that data is not loaded into memory; it simply resides on disk.

In such a case you may be able to hold more data than your HANA memory size by ensuring the data is not kept in main memory, and then manage memory by loading/unloading the required data. This is not generally recommended, since you would prefer to keep all your data in main memory, but with the HOT / WARM / COLD data concept in BW on HANA it is something that can be achieved.
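
To check this, something like the following against M_CS_TABLES should do (a sketch; the LOADED column reports FULL, PARTIALLY or NO):

-- how much of the column store is actually resident in memory, by load status
SELECT LOADED,
       COUNT(*) AS "Tables",
       ROUND(SUM(MEMORY_SIZE_IN_TOTAL)/1024/1024/1024) AS "In-Memory GB"
FROM SYS.M_CS_TABLES
GROUP BY LOADED;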

Also please check your delta memory consumption, and try to reduce the delta footprint if possible. Rather than waiting for HANA to perform the delta merge, try to trigger it yourself.

It might help you reduce the overall memory consumption.
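
To find the candidates, a sketch like this lists the biggest delta footprints first:

-- tables with the largest delta storage, i.e. merge candidates
SELECT TOP 10 SCHEMA_NAME, TABLE_NAME,
       ROUND(MEMORY_SIZE_IN_DELTA/1024/1024) AS "Delta MB"
FROM SYS.M_CS_TABLES
ORDER BY MEMORY_SIZE_IN_DELTA DESC;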

Regards,

Ravi

Former Member
0 Kudos

Hi Vimal,

OK, I see. Do you have SAP BW running on top of HANA, or how do you use your HANA system?

Actually, I was also thinking about a similar scenario for one of our customers who is using BW on HANA.

The customer needed a SQL script that loads the important data into memory when HANA restarts. We were also thinking of using semantically partitioned objects in BW (partitioned by calendar year, for instance) and then writing a SQL script, executed regularly by a BW process chain, that unloads the data again once it is no longer requested.
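
The load/unload part of such a script is plain SQL; for example (schema and table names here are made up):

-- preload an important table completely after a restart
LOAD "SALES"."TRANSACTIONS" ALL;
-- evict an old partition table that is rarely queried
UNLOAD "SALES"."TRANSACTIONS_2010";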

I think this can be an idea, but in the end we also decided to reduce the memory usage by shrinking the data with an optimized data model (reducing layers) and by optimizing the BW housekeeping jobs (deleting PSA, DSO change logs, etc.). If this is still not enough, you can think about using Sybase IQ as near-line storage. You can also optimize the delta merge behavior of your system to reduce memory consumption; of course it is important that the delta merge is executed regularly. I remember in the beginning we had issues where the merge was not executed by the system.

However much you can optimize the memory consumption, the question for me at the end is: what is more expensive, the license and the hardware, or the maintenance of all the mentioned activities to reduce the memory consumption?

Best Regards,
Marcel

0 Kudos

Hi Marcel,

We are on pure HANA; there is no BW in the mix. At this time we are looking into upgrading to SP5 to see whether that fixes some of the memory-related bugs (as mentioned by Lars above).

Regards,

Vimal Shah

0 Kudos

Hi Ravi,

We use SLT for data provisioning, so when you say "perform the delta merge yourself", do you mean creating a stored procedure and running it on a schedule?

Thanks,

Vimal Shah

former_member184768
Active Contributor
0 Kudos

Hi Vimal,

That is exactly what I was hinting at. You may want to run a script after your SLT jobs are done to merge the delta into the main store. You need to ensure that all data loading operations are completed before running the delta merge.
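
The statement such a scheduled script would issue is simply (schema/table name is a placeholder):

-- merge the delta of a replicated table into its main store
MERGE DELTA OF "SLT_SCHEMA"."VBAK";
-- or, to override the system's merge decision entirely:
MERGE DELTA OF "SLT_SCHEMA"."VBAK" WITH PARAMETERS ('FORCED_MERGE' = 'ON');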

Regards,

Ravi

former_member184768
Active Contributor
0 Kudos

Hi Vimal,

Also, please check whether you have history enabled for any tables. The additional history table kept for such a table is used for time travel queries. If that is not a requirement, you can delete the history data for the table and reclaim the space using an ALTER SYSTEM command.
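
For illustration, the statements would look roughly like this (table name is a placeholder; please verify the exact syntax against the SQL reference for your revision):

-- turn the history table back into a normal table, dropping the time travel data
ALTER TABLE "MYSCHEMA"."MYTABLE" DROP HISTORY;
-- then reclaim the version space
ALTER SYSTEM RECLAIM VERSION SPACE;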

Regards,

Ravi

mathanponnucham
Employee
0 Kudos

Hello,

You can use smart merge after a data loading operation is completed. More info in the link below.

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10681c09-6276-2f10-78b8-986c68efb...
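
The hint itself looks like this (table name is a placeholder; smart merge must be active in the mergedog configuration):

-- tell HANA the load is finished; the system decides whether a merge pays off
MERGE DELTA OF "MYSCHEMA"."MYTABLE" WITH PARAMETERS ('SMART_MERGE' = 'ON');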

Rgds,

Mat.

justin_molenaur2
Contributor
0 Kudos

I am seeing some inconsistency (my perception, at least) in the system tables available through the Administration/System Information tab.

First off - my interest lies solely in quantifying memory usage at a table level. I know this depends on whether the table is loaded into memory or not, and I would assume that some of these system tables account for that as well.

On to the details...

When we query the SERVICE_COMPONENT view via "System Information" (screenshot not reproduced here), the result suggests a roughly 419 GB (converted) memory consumption by the Column Store Tables.

Now, when we use the "Used Memory By Tables" System Information view (again, screenshot not reproduced), the Used Memory for ALL column tables comes out at roughly 362 GB (converted from MB).

A third reconciliation point is simply querying the view "SYS"."M_CS_TABLES":

-- total column store table memory in MB
SELECT ROUND(SUM("MEMORY_SIZE_IN_TOTAL")/1024/1024, 4) AS "MEMORY (MB)"
FROM "SYS"."M_CS_TABLES";

With this, we get a result in line with the second example from System Information (which is probably sourced from the same tables/views).

So the question here is: which is more accurate? Is the M_CS_TABLES view the right place to get concrete memory usage at a table level? If so, why does the SERVICE_COMPONENT query differ so greatly? Is there additional overhead/management for column store tables beyond the delta merge (delta size is already included in MEMORY_SIZE_IN_TOTAL from M_CS_TABLES)?
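
For completeness, the per-table view I am working from splits the M_CS_TABLES number into main and delta storage (a sketch):

-- per-table breakdown of column store memory into main and delta
SELECT TOP 20 SCHEMA_NAME, TABLE_NAME,
       ROUND(MEMORY_SIZE_IN_MAIN/1024/1024)  AS "Main MB",
       ROUND(MEMORY_SIZE_IN_DELTA/1024/1024) AS "Delta MB",
       ROUND(MEMORY_SIZE_IN_TOTAL/1024/1024) AS "Total MB"
FROM "SYS"."M_CS_TABLES"
ORDER BY MEMORY_SIZE_IN_TOTAL DESC;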

Regards,

Justin

Answers (1)

mathanponnucham
Employee
0 Kudos

Dear Vimal,

Please refer to the attachment of SAP Note 1514966 - SAP HANA 1.0: Sizing SAP In-Memory Database.

While sizing a HANA system, you need to consider the RAM requirements for static data, dynamic data, delta tables, query execution, etc.
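
As a rough illustration only (my own rule of thumb, not a quote from the note):

-- RAM for data  ~  uncompressed source footprint / compression factor (often around 7)
-- total RAM     ~  RAM for data * 2  (headroom for delta, merges, intermediate results)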

As Marcel asked: what is your intention? Based on that, you have to pick a solution.

Rgds,

Mat.