Query to limit values shown by timestamp

former_member248450
Active Participant
0 Kudos

Dear experts,

I am using this query to get the disk size, and it shows me only the latest 1,000 entries because of the row limit set in Studio.

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE from _SYS_STATISTICS.global_disks where USAGE_TYPE = 'DATA' ORDER by SNAPSHOT_ID DESC


Is it possible to get the size between certain SNAPSHOT_ID values?



Something like: select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE from _SYS_STATISTICS.global_disks where USAGE_TYPE = 'DATA' and SNAPSHOT_ID = '2016-02-14 12:00:00'


I want to get the disk size at a certain point in time two weeks ago. Can anyone suggest what to modify?



Thanks,
Jun

Accepted Solutions (1)

anindya_bose
Active Contributor
0 Kudos

Can you please try this? I don't have access to the _SYS_STATISTICS schema right now, so I could not test it myself.

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE from _SYS_STATISTICS.global_disks
where USAGE_TYPE = 'DATA' AND SERVER_TIMESTAMP BETWEEN 'YYYYMMDDHHMMSS' and 'YYYYMMDDHHMMSS'
ORDER by SNAPSHOT_ID DESC
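
For example, to look at the window around noon two weeks back, the range could be written like this (only a sketch; the timestamp literals below are assumptions and should be replaced with the window you actually want):

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE
from _SYS_STATISTICS.GLOBAL_DISKS
where USAGE_TYPE = 'DATA'
  -- assumed 10-minute window around 2016-02-14 12:00:00; adjust to your own range
  and SERVER_TIMESTAMP between '2016-02-14 11:55:00' and '2016-02-14 12:05:00'
order by SNAPSHOT_ID desc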

By the way, you can change the default limit of 1000 to something else if you need to. It should be under:

Window ---> Preferences ---> SAP HANA ---> Runtime ---> Result

(The path may change depending on your Studio version.)

Regards

Anindya

former_member248450
Active Participant
0 Kudos

I was able to get the disk size. However, our server timestamp and snapshot ID are not the same, so I need to do the math.

Is it possible to get it for a particular point in time?

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE from _SYS_STATISTICS.global_disks
where USAGE_TYPE = 'DATA' AND SERVER_TIMESTAMP = '20150302120000'

anindya_bose
Active Contributor
0 Kudos

You can, but then it would try to match the exact timestamp down to the second, and you might not get any records. Using a range with BETWEEN is better. For example, if you want to see what happened in a 5-minute time frame, you can adjust the FROM and TO timestamps accordingly, as in the sketch below.
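
A minimal sketch of that idea (the timestamp literals are assumptions, chosen only to illustrate a 5-minute window):

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE
from _SYS_STATISTICS.GLOBAL_DISKS
where USAGE_TYPE = 'DATA'
  -- assumed 5-minute window; replace with the range you are interested in
  and SERVER_TIMESTAMP between '2015-03-02 12:00:00' and '2015-03-02 12:05:00'
order by SNAPSHOT_ID desc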

Can you please provide a sample data set? I am not able to see the data myself, as I don't have authorization for the _SYS_STATISTICS schema right now.

Regards

Anindya

former_member248450
Active Participant
0 Kudos

Below is some test data. Thanks

    ;SNAPSHOT_ID               ;HOST    ;TOTAL_SIZE     ;USED_SIZE
1   ;Feb 25, 2016 3:16:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
2   ;Feb 25, 2016 3:21:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
3   ;Feb 25, 2016 3:26:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
4   ;Feb 25, 2016 3:31:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
5   ;Feb 25, 2016 3:36:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
6   ;Feb 25, 2016 3:41:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
7   ;Feb 25, 2016 3:46:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
8   ;Feb 25, 2016 3:51:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
9   ;Feb 25, 2016 3:56:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
10  ;Feb 25, 2016 4:01:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
11  ;Feb 25, 2016 4:06:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
12  ;Feb 25, 2016 4:11:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
13  ;Feb 25, 2016 4:16:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
14  ;Feb 25, 2016 4:21:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
15  ;Feb 25, 2016 4:26:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
16  ;Feb 25, 2016 4:31:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
17  ;Feb 25, 2016 4:36:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
18  ;Feb 25, 2016 4:41:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
19  ;Feb 25, 2016 4:46:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
20  ;Feb 25, 2016 4:51:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
21  ;Feb 25, 2016 4:56:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
22  ;Feb 25, 2016 5:01:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
23  ;Feb 25, 2016 5:06:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
24  ;Feb 25, 2016 5:11:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
25  ;Feb 25, 2016 5:16:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
26  ;Feb 25, 2016 5:21:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
27  ;Feb 25, 2016 5:26:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
28  ;Feb 25, 2016 5:31:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
29  ;Feb 25, 2016 5:36:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
30  ;Feb 25, 2016 5:41:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
31  ;Feb 25, 2016 5:46:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
32  ;Feb 25, 2016 5:51:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
33  ;Feb 25, 2016 5:56:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
34  ;Feb 25, 2016 6:01:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
35  ;Feb 25, 2016 6:06:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
36  ;Feb 25, 2016 6:11:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
37  ;Feb 25, 2016 6:16:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
38  ;Feb 25, 2016 6:21:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
39  ;Feb 25, 2016 6:26:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
40  ;Feb 25, 2016 6:31:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
41  ;Feb 25, 2016 6:36:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
42  ;Feb 25, 2016 6:41:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
43  ;Feb 25, 2016 6:46:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
44  ;Feb 25, 2016 6:51:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
45  ;Feb 25, 2016 6:56:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
46  ;Feb 25, 2016 7:01:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
47  ;Feb 25, 2016 7:06:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
48  ;Feb 25, 2016 7:11:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
49  ;Feb 25, 2016 7:16:09.0 AM ;hostserv;549,486,329,856;9,701,916,672
50  ;Feb 25, 2016 7:21:09.0 AM ;hostserv;549,486,329,856;9,701,916,672

anindya_bose
Active Contributor
0 Kudos

OK. In that case, you can use the BETWEEN clause on SNAPSHOT_ID:

SNAPSHOT_ID  BETWEEN 'YYYYMMDDHHMMSS' and 'YYYYMMDDHHMMSS'
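
Based on the sample data above, a sketch with assumed timestamp literals could look like this (adjust the literal format if your system expects something different):

select SNAPSHOT_ID, HOST, TOTAL_SIZE, USED_SIZE
from _SYS_STATISTICS.GLOBAL_DISKS
where USAGE_TYPE = 'DATA'
  -- assumed one-hour window taken from the sample data; not a tested value
  and SNAPSHOT_ID between '2016-02-25 03:00:00' and '2016-02-25 04:00:00'
order by SNAPSHOT_ID desc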



Do you see more than 30 days of data in this table? I think this table stores only the last 30 days of data. Are you looking for data older than 30 days?



Regards

Anindya

former_member248450
Active Participant
0 Kudos

I see 42 days of data. No, I am not looking beyond that for now.

Answers (0)