
live cache database growth

former_member246694
Participant

Dear all,

How can we get the database growth of the liveCache database for a period of time?

Please point me to where I can find these values.

Thanks n Regards,

KK

Accepted Solutions (0)

Answers (2)


former_member229109
Active Contributor

Hello,

"How can be we get the database growth of livecache database for a period of time?

Please guide me the path to find these values."

1. Please let us know whether you need general statistics of the database growth, or elaborate a bit more on your question.

The fill level of the database (for example, the size of the data volumes and the number of permanently and temporarily used pages) is collected by the Database Analyzer every 15 minutes. If the Database Analyzer is activated, you can check these statistics in the DBAN_FILLING.csv file, which is created every day (a small parsing sketch is included at the end of this reply).

Please review SAP Note 530394 (Bottleneck Analysis with Database Analyzer).

See the MAXDB library documentation:

http://maxdb.sap.com/doc/7_7/default.htm -> Tools -> Database Analyzer -> More Information: Database Analyzer Log Files

2. Please let us know the version of the database.
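As a rough illustration of point 1, here is a minimal Python sketch that reads one day's DBAN_FILLING.csv and prints the fill-level samples it contains. The delimiter and the column names used below (including the FILL_COLUMN name) are assumptions, not the documented file layout; check the header line of your own DB Analyzer log files and adjust them for your version.

#!/usr/bin/env python3
"""Minimal sketch: print the fill-level samples recorded in one day's
DBAN_FILLING.csv. Delimiter and column names are assumptions -- check the
header of your own DB Analyzer log file and adjust them."""

import csv
import sys

# Hypothetical column name; replace it with the permanently-used-pages
# column reported by your DB Analyzer version.
FILL_COLUMN = "Perm Used Pages"

def read_filling(path, delimiter=";"):
    """Yield (date, time, fill value) tuples from a DBAN_FILLING.csv file."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter=delimiter):
            # "Date" and "Time" are assumed column names as well.
            yield row.get("Date"), row.get("Time"), row.get(FILL_COLUMN)

if __name__ == "__main__":
    for date, time, fill in read_filling(sys.argv[1]):
        print(date, time, fill)

Running this against one archived DBAN_FILLING.csv file shows how the data area fills up over the course of that day.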

Thank you and best regards, Natalia Khlopina

former_member246694
Participant

Hi Natalia,

Thank you for the information.

It has served the purpose of finding the database size every day.

If I want to compare the DB size over a period (starting date to ending date), is there a way to get the result in that form?

Regards,

KK

former_member229109
Active Contributor

Hello,

1. You are running SAP SCM 5.1, which means you are an SAP customer.

You could therefore open an SAP message so that we can discuss your project there.

We could log in to the system to gather more information if needed.

2. Restart the Database Analyzer after liveCache has been restarted in LC10, or set up the Database Analyzer to restart automatically for the LCA connection. The DBAN_FILLING.csv file contains entries at 15-minute intervals by default, and a new file is created every day.

You could then compare the fill level of the data area after the restart with the last fill-level statistics collected by the Database Analyzer and review the difference (a comparison sketch is included at the end of this reply).

3. Please post more details about your project.

Do you have any issues with liveCache on the system?

(For example, unexpected growth of the data-area fill level after liveCache has been restarted and has run for some time.)

4. For the SAP liveCache documentation, see SAP Note 767598.
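Picking up KK's question about comparing the DB size between a starting date and an ending date: if you archive the daily DBAN_FILLING.csv files, one simple option is to diff the last sample of the first day against the last sample of the last day. The sketch below is only an illustration; the archived file names, the delimiter, the FILL_COLUMN name and the 8 KB page size are assumptions to adapt to your own setup.

#!/usr/bin/env python3
"""Sketch: compute data-area growth between two archived daily
DBAN_FILLING.csv files. File naming, delimiter, the fill-level column
and the page size are assumptions -- adjust them to your installation."""

import csv

FILL_COLUMN = "Perm Used Pages"   # hypothetical column name
PAGE_SIZE_KB = 8                  # MaxDB/liveCache pages are typically 8 KB

def last_fill_value(path, delimiter=";"):
    """Return the fill value (in pages) from the last row of the file."""
    last = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter=delimiter):
            last = row
    return int(last[FILL_COLUMN]) if last else 0

def growth(start_file, end_file):
    """Return the growth between the two days as (pages, megabytes)."""
    delta_pages = last_fill_value(end_file) - last_fill_value(start_file)
    return delta_pages, delta_pages * PAGE_SIZE_KB / 1024.0

if __name__ == "__main__":
    # Hypothetical archived file names for the start and end of the period.
    pages, mb = growth("DBAN_FILLING_20110110.csv", "DBAN_FILLING_20110112.csv")
    print(f"Growth over the period: {pages} pages (about {mb:.1f} MB)")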

Thank you and best regards, Natalia Khlopina

markus_doehr2
Active Contributor

Check

Note 352081 - Additional service-relevant functions for MaxDB

and transaction DB50 - Statistics (http://help.sap.com/saphelp_nw70ehp1/helpdata/en/7c/ba4aae86d08840b7281ee810350fcf/frameset.htm)

Markus

former_member246694
Participant

Hi Markus,

I have observed the following in my system.

DB50: Not available

DB50N: Not taking me to the LCA connection.

LC10: Statistics-> has only 2 sections.

1> Table sizes

2> Class containers.

But they are not giving me a summary of database growth over a period of time, as we usually get in DB02.

In Table sizes, the comparison interval is not generating any results. It is collecting values only for the present day.

Please guide me further.

Thanks n Regards,

KK

Edited by: Anees Qureshy on Jan 12, 2011 8:33 AM


markus_doehr2
Active Contributor

> DB50: Not available

> DB50N: Not taking me to the LCA connection.

Did you create the connection to your LiveCache using DB59?

> LC10: Statistics-> has only 2 sections.

> 1> Table sizes

> 2> Class containers.

>

> But they are not giving me a summary of database growth over a period of time, as we usually get in DB02.

Do you run the RSADAT6M job every day?

What SCM/APO release are you running?

Markus

former_member246694
Participant

Hi Markus,

I guess the report you have mentioned is not listed among the standard jobs (SM36). I will make a note of this and schedule it in my DEV and QAS systems to analyse the output of the report further.

SCM Version: SAP SCM 5.1

Yes, we have made a connection in DB59.

How frequently should the report run as a job? Where can I see the output?

Regards,

KK

adrian_dorn
Active Participant

If you have SAP_BASIS Release 7.0 or higher (I think this is the case in SCM 5.1), then this job is NOT scheduled with transaction SM36 but with transaction DB13, by selecting the action "Update Table Statistics".

Note: If you have never used DB13 before, then you should be aware that by default DB13 selects the local database and not the liveCache. So you will have to enter "LCA" manually in the field "System" (in the top left corner of the DB13 screen) before you can schedule DBA jobs for the liveCache. If you click the "i" button while the action "Update Table Statistics" is marked in DB13, you will jump directly to the chapter of the SAP documentation about table sizes. This is quite convenient because you don't have to search the documentation manually. That chapter answers all your questions.

The output of the job is what you see in the DB59 node "Table sizes". To put it in other words: if you don't schedule this DB13 job, then you won't see anything in the DB59 node "Table sizes".

The general recommendation is to schedule this DB13 job once a week. But of course you can schedule it once a day if you are interested in comparing the table growth on a daily basis.

former_member246694
Participant

Hi Adrian,

Thank you for the information.

We have update statistics scheduled every day in our system.

The following is the log of the job:

"Job started

Step 001 started (program RSADAUP1, variant &0000000000249, user ID BGDUSER)

Job started

Step 001 started (program RSDBAJOB, variant &0000000000200, user ID BGDUSER)

Job finished

The action was performed successfully

Job finished "

The final output it shows is as follows:

executed : 12 tables | UPD STAT COLUMN : 0 tables

UPD STAT errors : 0 tables | UPD STAT TABLE : 12 tables

UPD STAT excluded : 0 tables |

Is there anything more to do to get the values?

I have observed that the job Mr. Markus mentioned in his reply is reporting the values of the tables.

Regards,

KK

adrian_dorn
Active Participant

That's the wrong job.

The DB13 job which you have scheduled is the "Update all optimizer statistics" job.

You need another DB13 job called "Update Table Statistics".

The names of these two jobs are similar, but they do completely different things: the first one updates the statistics used by the database optimizer, and the second one collects the sizes that are shown in the DB59 node "Table sizes".

former_member246694
Participant

Hi Adrian,

The closest job to the one you have mentioned is 'Update statistics for marked tables'. Is this the right one?

I do not find an entry called 'Update Table Statistics' exactly.

Regards,

KK

adrian_dorn
Active Participant

This job is also wrong.

Please post your SAP_BASIS Release and SAP_BASIS Support Package. Perhaps you have a very old Release / Support Package where this DB13 job does not yet exist.