on 05-09-2011 8:14 AM
Hi consultant,
As you can see in the attachment, on some large tables the statistics are always too old (SQL Server auto-updates them when the modification counter exceeds 20%?). This causes poor performance in some specific queries. Should I manually update statistics? Based on your experience, how often should I run it, weekly? Does it impact system performance because of recompiling? Does it refresh the DMVs of expensive SQL in DBACOCKPIT and make expensive SQL nearly undetectable?
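One way to see how stale the statistics actually are is to query the statistics metadata directly. A minimal T-SQL sketch, with MSEG taken as the example table from the attachment (`sys.dm_db_stats_properties` requires SQL Server 2008 R2 SP2 / 2012 SP1 or later; on older releases `STATS_DATE` and `sys.sysindexes.rowmodctr` give similar information):

```sql
-- Last update date and modification counter for each statistic on MSEG
SELECT s.name                  AS stats_name,
       sp.last_updated,        -- when the statistic was last rebuilt
       sp.rows,                -- rows in the table at that time
       sp.modification_counter -- changes accumulated since the last update
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('MSEG');
```

Comparing `modification_counter` against `rows` shows how close each statistic is to the auto-update threshold mentioned above.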
---
Index ~M
MANDT, MATNR, WERKS, LGORT, BWART, SOBKZ
Index ~Z02
MANDT, /BEV2/ED_AEDAT, BWART, WERKS
---
SELECT *
FROM qa2."MSEG"
WHERE "MANDT" = '218'
  AND "MATNR" LIKE 'AS%'
  AND "WERKS" = 'CS51'
  AND "BWART" = '55Z'
  AND "/BEV2/ED_AEDAT" BETWEEN '20110101' AND '20110430'
(index ~Z02, 3978 ms / 91042 ms, wrong plan: statistics too old)
SELECT *
FROM qa2."MSEG"
WHERE "MANDT" = '218'
  AND "MATNR" LIKE 'AS%'
  AND "WERKS" = 'CS51'
  AND "BWART" = '55Z'
  AND "/BEV2/ED_AEDAT" BETWEEN '20110101' AND '20110430'
(index ~M, 4014 ms / 103164 ms, correct plan: statistics up to date after manual update statistics)
Hi Dennis, hi majamil!
This is a common misunderstanding among MS SQL Server administrators. The fact that statistics are automatically recalculated in SQL Server after a certain threshold of changes (read more on this e.g. in this Microsoft TechNet link) does not mean that Microsoft did a bad design here.
On some platforms, e.g. Oracle, updating statistics is a necessary part of basic database housekeeping, but in SQL Server it is in fact very rare that you really obtain a benefit from a statistics update, as its architecture is completely different.
If you have a performance problem with a particular SQL statement (you can identify it in transaction DBACOCKPIT under "Performance" > "SQL Statements", and also in an ST05 trace of the slow transaction or report), you should analyze the root cause in order to solve it permanently. Check the execution plan being used, determine why it is so bad, and try to understand why the Query Optimizer selected it. Only at that point do you have enough information to decide how to improve the performance. Of course, the solution could turn out to be updating the statistics, in which case just updating them would fix the issue; but most probably that is not the real solution (or at least you will never know why the problem went away), and so every time the execution plan is recalculated (because you force it, or simply because the SQL Server engine was restarted, etc.) you risk hitting the same problem again, and once again you will not be sure that simply updating the statistics (which can be a very costly operation) will solve it.
So, whenever you have a problem with a particular SQL statement, I recommend trying to understand the reason for the bad execution plan and, if you need help analyzing it, opening a message to SAP on the BC-DB-MSS component, clearly indicating the statement (ideally trace the issue and provide the date, time, and user with which you created the trace, so that the SAP consultant can immediately start working on it).
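Outside of DBACOCKPIT, the expensive statements and their cached plans can also be inspected directly through the DMVs mentioned in the question. A hedged sketch (the TOP 10 cutoff and ordering by elapsed time are illustrative choices, not a recommendation):

```sql
-- Top cached statements by total elapsed time, with their execution plans
SELECT TOP (10)
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       SUBSTRING(st.text, qs.statement_start_offset / 2 + 1, 200) AS statement_text,
       qp.query_plan          -- XML showplan; open it to see the chosen index
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_elapsed_time DESC;
```

Note that, as the question suspects, these statistics are reset when a plan is evicted from the cache, e.g. after an `UPDATE STATISTICS` forces a recompile, so an expensive statement may temporarily disappear from this list.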
I hope that this is useful for you!
Cheers!
-Jesús
Hi,
In my experience, you have to manually update the statistics on a weekly basis. Sometimes the indexes of some tables are not updated by the standard tools, so you can use transaction DB20 for those indexes.
Regards,
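If you do decide to schedule a manual update as suggested here, a minimal sketch of what such a weekly job might run, again using MSEG as the example table (the sample rate is an assumption to be tuned; `FULLSCAN` on a very large table can be expensive and will trigger plan recompilations, which is exactly the side effect the original question asks about):

```sql
-- Refresh all statistics on one large table with a full scan
UPDATE STATISTICS MSEG WITH FULLSCAN;

-- Cheaper alternative: refresh with a sampled scan
UPDATE STATISTICS MSEG WITH SAMPLE 10 PERCENT;

-- Or refresh every out-of-date statistic in the database in one pass
EXEC sp_updatestats;
```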
Hi,
a very good article from the MS SQL Server Team about statistics can be found here:
Kind regards
Manuel