Storing SAP DB Control Center Historical Data in SAP HANA Dynamic Tiering

This document provides instructions on how to selectively move SAP DB Control Center historical data from its in-memory SAP HANA repository to extended tables in SAP HANA Dynamic Tiering.

SAP HANA with the Dynamic Tiering (DT) option enables the migration of data between SAP HANA in-memory tables and Dynamic Tiering extended storage. ‘Hot’ data resides in memory, allowing maximum access performance, whereas ‘warm’ data does not (always) reside in memory; because ‘warm’ data is accessed less often than ‘hot’ data, moving it to extended storage frees up memory. This paper explains how to move SAP DB Control Center (DCC) historical data from its in-memory tables to extended storage. Doing so allows you to keep all historical data that DCC generates: the most up-to-date historical data stays ‘hot’ while the older data becomes ‘warm’.

PREREQUISITES

To move historical data from DCC tables to the DT extended tables, you require the following:

  • Access to an SAP HANA SPS09 or higher system
  • SAP DB Control Center (DCC) SP10 or higher
  • Both the DCC and the SAP HANA Dynamic Tiering component installed on that SAP HANA system
  • Basic working knowledge of SAP HANA Studio

If you would like to learn more about SAP HANA Dynamic Tiering, please review the following quick start guide: http://scn.sap.com/docs/DOC-66016.

CHANGE PURGE SETTINGS FOR STORAGE OF HISTORICAL DATA

We will first go over how to update the purge settings in DCC. DCC deletes records that are older than a configurable maximum age (the default is 30 days). For example, with a maximum age of one month, a purge run executes every 5 minutes and deletes any records older than one month. Purging frees space for newly generated data, which is why it runs so frequently. We want to set a high maximum age so that there is enough time to transfer the data to extended storage; if the maximum age is very small, new data could be deleted before it is moved. As such, we’ll set the maximum purge age to an artificially high value such as 5,256,000 minutes (10 years) so that it doesn’t need to be monitored constantly.

Follow the instructions below to update the purge settings for DCC:

  1. Right click on your SYSTEM user.
  2. Click "Open SQL Console".
  3. Copy and paste the following script onto the console.
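The original article’s script is not reproduced here; the following is a hedged sketch of what it does. The configuration table and property names (sap.hana.dbcc.data::Config.Properties, purge.maxAge) are assumptions for illustration only — check your DCC installation in the SAP_HANA_DBCC schema for the actual names before running anything.

```sql
-- Hedged sketch: raise the DCC purge maximum age to 5,256,000 minutes (10 years)
-- so that purging does not delete historical data before it is migrated.
-- The table and key names below are hypothetical; verify them against your
-- DCC installation before executing.
UPDATE "SAP_HANA_DBCC"."sap.hana.dbcc.data::Config.Properties"
   SET "VALUE" = '5256000'
 WHERE "KEY" = 'purge.maxAge';
```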


  4. Click the Deploy icon to execute the script.

CREATE EXTENDED TABLES

To migrate data from in-memory to extended storage, we first have to create an extended table with the same schema as the in-memory table that we’re transferring data from. The following instructions will guide you on how to do that:

  1. Right click on your SYSTEM user.
  2. Click "Open SQL Console".
  3. Copy and paste the following script onto the console.  This script contains the SQL statements that, when run, will create a table in extended storage. For the purpose of this exercise, we will be naming the table ‘test_DT’.
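The article’s script did not survive extraction; below is a hedged sketch using the CREATE TABLE ... USING EXTENDED STORAGE syntax that SAP HANA Dynamic Tiering provides. The column list is illustrative only — copy the actual column definitions from the in-memory DCC history table (here called ‘test’) that you are migrating from.

```sql
-- Hedged sketch: create an extended (dynamic tiering) table with the same
-- schema as the in-memory source table. The columns below are illustrative
-- assumptions; match them to your actual "SAP_HANA_DBCC"."test" table.
CREATE TABLE "SAP_HANA_DBCC"."test_DT" (
    "TIMESTAMP"  TIMESTAMP,      -- collection time of the historical record
    "RESOURCE"   NVARCHAR(256),  -- illustrative: monitored resource name
    "METRIC"     NVARCHAR(256),  -- illustrative: metric identifier
    "VALUE"      DOUBLE          -- illustrative: metric value
) USING EXTENDED STORAGE;
```

The USING EXTENDED STORAGE clause is what places the table in dynamic tiering rather than in memory; everything else is ordinary CREATE TABLE syntax.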

  4. Click the Deploy icon.
  5. Left click on your SYSTEM user.
  6. Click log off.
  7. Double click on your SYSTEM user to restart the user.
  8. Click SYSTEM > Catalog > SAP_HANA_DBCC > Tables.

The table that you created in this section will now appear in this folder. It is an empty table in extended storage. You will also notice that the table is listed in the catalogue with EXTENDED noted beside its name, which informs the user that it is an extended table.

MIGRATING SELECTED HISTORICAL DATA FROM IN-MEMORY TO EXTENDED STORAGE USING STORED PROCEDURES

As mentioned earlier, the purpose of adding the DT option alongside DCC is to migrate ‘warm’ historical data from DCC in-memory tables to extended tables in DT. Data used for daily reporting and other high-priority data is ‘hot’ and must stay in memory. ‘Warm’ data is still required to operate applications, but since it isn’t accessed as frequently as ‘hot’ data, it can be moved to extended storage. In this case, the historical data sitting in DCC is the ‘warm’ data, so we want to select it and move it to DT tables. To migrate selected historical data from in-memory to extended storage, we will use a stored procedure, in this case called “Migrate_Aged_Orders_1”. When this set of SQL statements runs, any data older than the date specified in the variable varAgedDate is inserted into the extended table that you created in the previous section and then deleted from the in-memory table; deleting the transferred data is necessary so that duplicate entries do not occur.

Complete the following:

  1. Right click on your SYSTEM user.
  2. Click "Open SQL Console".
  3. Copy and paste the following script onto the console.
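The procedure script did not survive extraction; the following is a hedged SQLScript sketch of what the text describes: copy rows older than varAgedDate into the extended table, then delete them from the in-memory table. The procedure and table names come from the article; the column-free INSERT assumes ‘test’ and ‘test_DT’ share an identical column layout.

```sql
-- Hedged sketch of the migration procedure described in the text.
-- Assumes "test" (in-memory) and "test_DT" (extended) have identical schemas.
CREATE PROCEDURE "SAP_HANA_DBCC"."Migrate_Aged_Orders_1" ()
LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE varAgedDate TIMESTAMP;

    -- Cutoff: one day before the current date (adjust to your retention policy)
    SELECT ADD_DAYS(CURRENT_TIMESTAMP, -1) INTO varAgedDate FROM DUMMY;

    -- Copy the aged ('warm') rows into the extended table ...
    INSERT INTO "SAP_HANA_DBCC"."test_DT"
        SELECT * FROM "SAP_HANA_DBCC"."test"
         WHERE "TIMESTAMP" < :varAgedDate;

    -- ... then remove them from the in-memory table to avoid duplicates
    DELETE FROM "SAP_HANA_DBCC"."test"
     WHERE "TIMESTAMP" < :varAgedDate;
END;
```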

  4. Click the Deploy icon.
  5. Left click on your SYSTEM user.
  6. Click log off.
  7. Double click on your SYSTEM user to restart the user.
  8. Click SYSTEM > Catalog > SAP_HANA_DBCC > Procedures.  You will find the procedure you just created here. Each time you want to run this procedure, you must call it manually. Proceed with the following instructions to do so.
  9. Right click on your SYSTEM user.
  10. Click "Open SQL Console".
  11. Copy and paste the following script onto the console.

    CALL "SAP_HANA_DBCC"."Migrate_Aged_Orders_1" ();
  12. Click the Deploy icon.

Note: This procedure migrates all historical data older than one day before the current date; this is just for illustrative purposes. In practice, set the value of the varAgedDate variable to align with your organization’s data storage policy (e.g. keep the latest 3 months in main memory and move everything else to extended storage).

VERIFY DATA PARTITION

After executing the Migrate_Aged_Orders_1 stored procedure, you can run the following SQL statements to verify that the historical data is now partitioned as expected between the in-memory and dynamic tiering tables:

    SELECT * FROM "SAP_HANA_DBCC"."test" ORDER BY "TIMESTAMP" ASC;

    SELECT * FROM "SAP_HANA_DBCC"."test" ORDER BY "TIMESTAMP" DESC;

    SELECT * FROM "SAP_HANA_DBCC"."test_DT" ORDER BY "TIMESTAMP" ASC;

    SELECT * FROM "SAP_HANA_DBCC"."test_DT" ORDER BY "TIMESTAMP" DESC;

The results after running the four SQL statements above are illustrated below. There are screenshots of each table’s results in ascending and descending order, to show how much of the data has been deleted and moved from the ‘test’ table to the ‘test_DT’ table. In this example, the earliest data was collected on October 28th, 2015 and the latest on November 18th, 2015. Since varAgedDate was set to one day before the current date, the data left in the ‘test’ table ranges from the earliest record at 12:00 AM to the latest at 9:08 PM on November 18th, 2015, while the ‘test_DT’ table holds records from October 28th, 2015 through November 17th, 2015.

The “test” in-memory table results: (screenshot)

The “test_DT” extended table results: (screenshot)

You can also check this by right-clicking on the test_DT table in the catalogue and choosing Open Data Preview from the popup menu.

Alternatively, you can issue SQL statements to retrieve the row counts of the two tables (the counts should change over time):

    SELECT COUNT(*) FROM "SAP_HANA_DBCC"."test";

    SELECT COUNT(*) FROM "SAP_HANA_DBCC"."test_DT";

For illustration purposes, running the first COUNT statement above returns the result shown below: 994 rows remain in the ‘test’ table after the migration. To confirm this, rerun SELECT * FROM "SAP_HANA_DBCC"."test" ORDER BY "TIMESTAMP"; as in the second screenshot — the row numbering in the first column shows that 994 rows are left in the ‘test’ table.

Running: (screenshot)

    SELECT COUNT(*) FROM "SAP_HANA_DBCC"."test";

Running: (screenshot)

    SELECT * FROM "SAP_HANA_DBCC"."test" ORDER BY "TIMESTAMP";

SUMMARY

The instructions above show you how to migrate historical data from the in-memory tables in DCC to extended storage in DT. The procedure allows you to keep the most up-to-date historical data (hot) in memory while moving the older data (warm) to extended storage.

To summarize the steps:

1. Adjust the purge settings to an artificially large maximum age so that you do not have to continuously monitor your data.
2. Create extended tables in Dynamic Tiering with the same columns as the in-memory tables you want to move data from.
3. Call a stored procedure to move selected data from the in-memory table to the extended table, then verify your results.

After completing these steps, you should have a table in DT containing the historical data from the earliest row remaining after the last purge up to the cutoff date set by the varAgedDate variable in the stored procedure. The in-memory table in DCC should contain only whatever is left after the data has been moved. The end result is two tables with the selected historical data migrated successfully.
