on 08-30-2016 4:58 PM
Hello all,
Could somebody help me correctly interpret the working of the "statement_memory_limit" parameter in HANA? I know it is a memory limit per statement, but I want to know which of the following is correct. Is the limit applied to:
-- the total memory allocated over the entire lifespan of the statement execution, or
-- the memory in use at any single point in time during the statement execution?
In my opinion it means the memory used at any point in time during the statement's execution, which would also explain the behavior below.
Secondly, suppose "statement_memory_limit" is set to 32 GB and the ACTUAL execution plan for a SQL query shows 225.3 GB. Does that mean this query should not be able to run?
I notice the query does get executed, and the actual memory shown in the HANA overview window does not even spike above 2-3 GB until it starts showing the result set in HANA Studio.
Lastly, does anyone have a guideline with a practical use case for setting "statement_memory_limit"? For example, is it better to keep it high (say 10, 20, 30 GB) or low (2, 4, 6, 8 GB)? I know that to decide this one needs to keep in mind concurrency, query complexity, and the dataset being worked on, but a practical analysis done by somebody would be helpful. I believe it should be on the lower side, say a maximum of 5 GB, which should be sufficient to run all general queries; this would enable higher concurrency and sanity in the HANA system overall.
Regards,
Nitin
Hi Nitin,
1) The explanation of the parameter statement_memory_limit:
2) Note 2302903 - HANA PlanViz "Memory Allocated" figure is higher than the statement memory limit explains the reason.
3) The default configuration for this parameter is no limit (personally, I have never set the limit). I can't say which memory limit is best for statements. However, you should test the effects of these settings carefully in order to avoid unexpected results (e.g. termination of backups or critical business queries).
For more information, please check:
2000002 - FAQ: SAP HANA SQL Optimization
2154870 - How-To: Understanding and defining SAP HANA Limitations
Known problem:
2289105 - SAP HANA DB: Standstill Situation During OOM
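If you do decide to set a limit, a minimal sketch of the system-level configuration looks like this (the value is in GB and `32` here is just an illustrative number; note that per-statement limits only take effect when statement memory tracking is enabled via `enable_tracking` and `memory_tracking` in the `[resource_tracking]` section of global.ini):

```sql
-- Prerequisite: enable statement memory tracking (global.ini, [resource_tracking]).
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('resource_tracking', 'enable_tracking') = 'on',
      ('resource_tracking', 'memory_tracking') = 'on'
  WITH RECONFIGURE;

-- Set a system-wide per-statement memory limit of 32 GB (value in GB,
-- [memorymanager] section of global.ini; adjust for your system).
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('memorymanager', 'statement_memory_limit') = '32'
  WITH RECONFIGURE;

-- Verify the effective value:
SELECT FILE_NAME, LAYER_NAME, SECTION, KEY, VALUE
  FROM M_INIFILE_CONTENTS
 WHERE KEY = 'statement_memory_limit';
```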
Regards,
Ning
Thank you Ning and Martin for the quick and helpful answers. This confirms that what I was thinking is correct. For setting "statement_memory_limit", I also concur that 15 to 30 % of the global_allocation_limit is a good starting point.
We can keep "statement_memory_limit" at that level in general, while the admin or system account used for backups can have a superseding higher value.
Overall, the memory limit setting should be geared towards enabling more concurrency and faster response, especially in a self-service BI environment. Special BI workloads requiring higher memory should be managed separately, e.g. scheduling/publication runs during off hours through a special account with a higher allowed memory limit.
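The per-user override mentioned above can be set with the user parameter STATEMENT MEMORY LIMIT, which supersedes the system-wide limit for that user's sessions; a sketch (the user name and the 200 GB value are hypothetical):

```sql
-- Give a hypothetical backup/admin user a higher per-statement limit (in GB),
-- superseding the system-wide statement_memory_limit for its sessions.
ALTER USER BACKUP_OPERATOR SET PARAMETER STATEMENT MEMORY LIMIT = '200';

-- To remove the user-specific override again:
ALTER USER BACKUP_OPERATOR CLEAR PARAMETER STATEMENT MEMORY LIMIT;
```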
I assume that the 225 GB is the accumulated memory allocated over time. The statement_memory_limit limits the peak memory utilization at any specific point in time, so the two figures are not a contradiction. According to SAP Note 2222250, a value of 15 to 30 % of the global_allocation_limit is a good starting point. This is also suggested by the standard SAP HANA parameter check (SQL: "HANA_Configuration_Parameters", SAP Note 1969700).
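The 15 to 30 % rule of thumb can be checked directly against the host allocation limit; a sketch assuming the standard monitoring view M_HOST_RESOURCE_UTILIZATION, whose ALLOCATION_LIMIT column is reported in bytes:

```sql
-- Suggested statement_memory_limit range (15-30 % of the allocation limit),
-- derived per host and converted from bytes to GB.
SELECT HOST,
       ROUND(ALLOCATION_LIMIT / 1024 / 1024 / 1024)        AS ALLOC_LIMIT_GB,
       ROUND(ALLOCATION_LIMIT * 0.15 / 1024 / 1024 / 1024) AS SUGGESTED_MIN_GB,
       ROUND(ALLOCATION_LIMIT * 0.30 / 1024 / 1024 / 1024) AS SUGGESTED_MAX_GB
  FROM M_HOST_RESOURCE_UTILIZATION;
```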