
Time travel needs enhancing

Former Member

I have been investigating the ‘time travel’ concept as it looked potentially very useful for reporting on historical master data.

However, having read the thread http://scn.sap.com/message/14197641 by Justin Molenaur and the responses by Lars, I was very disappointed.

Essentially, Time Travel can do exactly what you want if you manually update the data (remember to turn off auto-commit) and use SQL to interrogate it.

(see the SAP technical paper “SAP HANA - Time Travel Example” by Jack Boers)
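For anyone who has not seen that paper, the basic mechanics look roughly like the sketch below. This is a minimal example based on the standard HANA history table syntax; the table and column names are made up purely for illustration:

```sql
-- Hypothetical master data table, created as a history column table
CREATE HISTORY COLUMN TABLE customer_hist (
    customer_id NVARCHAR(10) PRIMARY KEY,
    sales_org   NVARCHAR(4),
    cust_group  NVARCHAR(2)
);

-- Per the paper: turn auto-commit off and commit explicitly,
-- so each change lands in its own commit
INSERT INTO customer_hist VALUES ('0000100001', '1000', '01');
COMMIT;

UPDATE customer_hist SET cust_group = '02' WHERE customer_id = '0000100001';
COMMIT;

-- A plain SELECT returns the latest committed values
SELECT * FROM customer_hist;

-- Time travel: the table as it looked at a given point in time
SELECT * FROM customer_hist AS OF UTCTIMESTAMP '2014-01-01 12:00:00';
```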

All fine, but of little use. What we need is to be able to use time travel in HANA models, so that queries can run and display master data as of a point in time.

SAP need to urgently enhance this feature to integrate it with SLT and allow Analytic/Calculation views to make use of it.

Lars mentioned “it looks as if you try to replace the complex data staging/loading handling present in SAP BW by something much simpler and by sneaking in a kind of time-dependency through usage of the history table feature” – and of course that is EXACTLY what we do want to do, not in a sneaky way but in a way supported by SAP – it should be very simple to do.

SLT does capture all changes in the source system, so why not use these changes to populate a history table that you can query with your HANA models? The response to use BODS etc. is just not acceptable. We use SLT precisely because it is real-time and does capture these changes. It should not need to do any transformation; it just replicates the ECC changes to HANA, where it should be possible to store them in a history table.

SAP need to:

1) Enable SLT to populate history tables (currently SLT can replicate to a history table, but the commit IDs are not maintained)

2) Enable Analytic and Calculation views to query the history tables

Why should we be forced to use BODS and somehow update the valid-to date of the latest existing version of a record when a new one comes through? It is so much effort for every SAP customer to try to re-invent the BW time-dependent master data concept in HANA, when SAP could, with a few simple extensions, actually make time travel useful. So over to SAP - how about it?

Andy


Answers (1)


Former Member

Hi Andy,

Your frustrations are not the first, nor will they be the last. Agreed that the product landscape should support this fundamental feature, which is found in most data mart/warehouse systems.

In the meantime, though, here are a few options for you to try. We're currently wrapping up a paper that discusses these approaches in more detail.

Triggers

Create three triggers on your master data table in HANA: one for INSERT, one for UPDATE, and one for DELETE. Within the triggers, write the appropriate code for maintaining your history tables. Then model your historical master data tables against the data foundation in Analytic Views via a temporal join.
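As a rough illustration, one of those triggers (the UPDATE case) could look something like the sketch below. The table names, the key column, and the simple valid-from/valid-to scheme are all assumptions for illustration, not tested code:

```sql
-- Sketch of an UPDATE trigger maintaining a hand-built history table.
-- CUSTOMER_MD and CUSTOMER_MD_HIST are illustrative names.
CREATE TRIGGER trg_customer_md_upd
AFTER UPDATE ON customer_md
REFERENCING NEW ROW newrow, OLD ROW oldrow
FOR EACH ROW
BEGIN
    -- Close the currently valid history record for this key
    UPDATE customer_md_hist
       SET valid_to = CURRENT_UTCTIMESTAMP
     WHERE customer_id = :oldrow.customer_id
       AND valid_to = '9999-12-31 23:59:59';

    -- Insert the new image as the open-ended current record
    INSERT INTO customer_md_hist
           (customer_id, sales_org, cust_group, valid_from, valid_to)
    VALUES (:newrow.customer_id, :newrow.sales_org, :newrow.cust_group,
            CURRENT_UTCTIMESTAMP, '9999-12-31 23:59:59');
END;
```

The INSERT and DELETE triggers follow the same pattern; the VALID_FROM/VALID_TO columns are what the temporal join in the Analytic View would be built on.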

Pros

1) History tables are maintained in "real-time".

Cons

1) Triggers on tables can be quite performance intensive on large INSERT/UPDATE/DELETE operations.

2) Maintenance of triggers. I do not believe these can be maintained by SLT, so you need to be careful about creating triggers before SLT initial loads.

3) Similar to the above, it is a bit labor-intensive to write all this code for all the relevant tables.

SLT configuration

Track the timestamp and operation codes via SLT. This can be accomplished via table deviation, using the Edit Table Structure option in IUCC_REPL_TABSTG in SLT. Add the fields ZZTIMESTAMP and ZZSLT_FLAG. Next, define a transformation rule in

Former Member

Forgot to list the pros and cons.

Pros

1) Avoids the headache of maintaining triggers.

2) Avoids the performance cost of triggers.

3) The SLT configuration can be re-applied to other tables.

Cons

1) This 'batch' approach doesn't allow for real-time updates of the history table.

2) The procedure needs to be maintained and scheduled.

Also, the modeling approach is the same as in the first option - the two only differ in how the history tables are populated.

Former Member

Jody,

These are all interesting ideas, and I would be interested to see the paper you are writing, but they do seem resource-intensive when this should really be a supported feature out of the box.

The beauty of a history table is that you can select from it as though it were a normal table (i.e. without specifying a commit ID or timestamp) and it will give you the latest values; only in the models where you need historical values do you need to specify the timestamp.

I will chase HANA development via other routes to see if we can get a commitment to supporting this in a future revision.

Andy

Former Member

Hi Andy,

I wouldn't say they're much more resource-intensive than setting up similar functionality in an ETL tool. A bit more work, but the SLT approach can be reused on other tables.

Yes, you would need to create a new stored procedure for each table (or modify the same one to handle multiple tables), but I would think you could handle this in a fairly clever fashion via dynamic SQL, passing in table names as parameters from a wrapper procedure. I may actually try this.
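For what it's worth, a rough sketch of that dynamic SQL idea could look something like the following. The table names, the ZZTIMESTAMP column from the SLT configuration above, and the EXEC-based approach are assumptions on my part rather than tested code:

```sql
-- Hypothetical procedure: copies new change images (identified by the
-- ZZTIMESTAMP column added via SLT) from a replicated table into its
-- matching history table. Names and structure are illustrative only.
CREATE PROCEDURE copy_changes_to_history (
    IN iv_source_table NVARCHAR(256),
    IN iv_hist_table   NVARCHAR(256)
)
LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE lv_sql NVARCHAR(2000);

    -- Append every change image newer than what the history table holds;
    -- IFNULL covers the first run, when the history table is still empty.
    lv_sql := 'INSERT INTO ' || :iv_hist_table ||
              ' SELECT * FROM ' || :iv_source_table ||
              ' WHERE zztimestamp > (SELECT IFNULL(MAX(zztimestamp),' ||
              ' ''1900-01-01 00:00:00'') FROM ' || :iv_hist_table || ')';

    EXEC :lv_sql;
END;
```

A wrapper procedure could then loop over a small configuration table of source/history pairs and call this for each entry on the schedule mentioned in the cons above.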

Speaking of the stored procedure - it could easily be modified to maintain a "CURRENT_FLAG" field that marks the current record. If you only want to report against current values, you just filter the attribute view on this column and join to your fact table on the key fields alone, rather than using a temporal join.
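To make that concrete, the "current only" reporting case would then look something like this (again with made-up table and column names):

```sql
-- Report against current master data only: filter on the flag and join on
-- the key fields, no temporal join needed. All names are illustrative.
SELECT f.order_id,
       f.amount,
       h.cust_group
  FROM sales_fact       AS f
  JOIN customer_md_hist AS h
    ON h.customer_id = f.customer_id
 WHERE h.current_flag = 'X';
```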