
Chasing documentation for Smart data access

Former Member

Hey all,

I'm currently trying to use HANA Smart Data Access to connect to an Oracle database, and I'm looking for documentation on what effect the following settings in the ini file have:

enable_remote_cache

enable_remote_source_capability

remote_cache_validity

semi_join_execution_strategies

semi_join_max_in_elements

semi_join_max_temp_table_cardinality

virtual_table_default_cardinality

virtual_table_format

virtual_table_threshold

A few, like the caching ones, are obvious; others, like semi_join_execution_strategies, are less obvious in terms of possible values and effects.

Any help would be welcome.

Accepted Solutions (1)


former_member183326
Active Contributor

Here is what I could find.

enable_remote_cache - Enables remote caching on Hive-like systems. Specifies whether query results can be cached in remote systems in order to avoid repeated MapReduce jobs on a Hadoop cluster.

enable_remote_source_capability - Enables remote caching on Hive-like systems. Specifies whether query results can be cached in remote systems in order to avoid repeated MapReduce jobs on a Hadoop cluster.

remote_cache_validity - Validity period of the remote cache in seconds. Specifies how long the cache stays valid. Once expired, it will be recreated with updated content.

semi_join_execution_strategies - Specifies the preferred order of semi-join execution strategies.

semi_join_max_in_elements - Specifies the maximum number of values in the IN clause for semi-join usage.

virtual_table_format - Forces the optimizer to choose between column-based or row-based operators.

semi_join_max_temp_table_cardinality - Maximum number of values to be inserted into a semi-join temp table.

virtual_table_default_cardinality - Default cardinality assumed for a virtual table when statistics are unavailable.

virtual_table_threshold - Maximum amount of memory allowed for fitting the result of a remote query into an itab structure.
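For anyone wanting to experiment with these, a sketch of how to inspect and change one of them at runtime. This assumes the parameters live in the smart_data_access section of indexserver.ini (worth verifying on your system first), and the value '3000' is purely illustrative:

```sql
-- Inspect the current Smart Data Access settings (section name assumed).
SELECT file_name, section, key, value
  FROM M_INIFILE_CONTENTS
 WHERE section = 'smart_data_access';

-- Change one parameter system-wide; WITH RECONFIGURE applies it immediately.
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('smart_data_access', 'semi_join_max_in_elements') = '3000'
  WITH RECONFIGURE;
```

Changing undocumented parameters on a production system is obviously at your own risk.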

Former Member

Thanks for the help; I'll try the internal forums for the expanded problem.

Basically we have a number of tables which represent user groups, user hierarchies and user info for the purpose of security.

We need to keep these in sync between two systems (HANA/Oracle).

History:

The original solution was to replicate the data; however, the source system truncates and rebuilds the tables quite rapidly, overloading the replication system and leading to periods where they are out of sync.

So we have decided to trial Smart Data Access by virtualizing those tables in Oracle and having HANA query them directly. Initial testing shows that without join relocation turned on, performance is horrible, and no wonder: it is pulling tonnes of records back from Oracle.

Turning join_relocation on, there is a big jump in performance, especially around memory usage. However, there is a new problem: it creates TEMP tables, so it immediately needs much higher privileges, but more importantly it doesn't seem to get the data types right all the time.

Basically it seems to quite intelligently identify that it can bundle up some of the non-virtualized tables' results and push them to Oracle to filter by before bringing the results back, improving performance. However, we are getting:

SAP DBTech JDBC: [403]: internal error: Failed to execute create and insert statement: [Oracle][ODBC][Ora]ORA-01727: numeric precision specifier is out of range (1 to 38)

and on inspection of the table it is creating (see below), it doesn't always get the data types right.

(65535 is the max value for 16 bits.)

The Oracle statement below was retrieved from Smart Data Access in HANA Studio, not from Oracle auditing:

CREATE GLOBAL TEMPORARY TABLE JRT_1_0X7FD550F57490 (
    "C1" number(38, 0)
    ,"C2" TIMESTAMP
    ,"C3" TIMESTAMP
    ,"C4" number(38, 0)
    ,"C5" nvarchar2(512)
    ,"C6" nvarchar2(512)
    ,"C7" number(38, 0)
    ,"C8" number(38, 0)
    ,"C9" nvarchar2(100)
    ,"C10" TIMESTAMP
    ,"C11" TIMESTAMP
    ,"C12" number(38, 0)
    ,"C13" nvarchar2(1)
    ,"C14" number(38, 0)
    ,"C15" number(38, 0)
    ,"C16" TIMESTAMP
    ,"C17" TIMESTAMP
    ,"C18" NCHAR(1024)
    ,"C19" nvarchar2(100)
    ,"C20" number(38, 0)
    ,"C21" nvarchar2(100)
    ,"C22" nvarchar2(1024)
    ,"C23" nvarchar2(128)
    ,"C24" nvarchar2(128)
    ,"C25" TIMESTAMP
    ,"C26" TIMESTAMP
    ,"C27" nvarchar2(1024)
    ,"C28" nvarchar2(1024)
    ,"C29" nvarchar2(1024)
    ,"C30" nvarchar2(1024)
    ,"C31" number(65535, 0)
)



So basically I'm trying to get my head around how those dials affect when it chooses to start pushing data remotely for the purposes of filtering:

1. Always push, so I can try to isolate which column/data type it can't translate.

2. Only push filters directly on the virtual tables and the joins between them (the VT tables are always blocked together in the query) to avoid the problem.
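For option 1, one way to narrow down the offending column is to compare the HANA-side column definitions of the local table being relocated against what Oracle accepts; since Oracle caps NUMBER precision at 38, any numeric column HANA describes with a larger length is a candidate. A sketch against HANA's system views ('MY_SCHEMA' and 'USER_GROUPS' are placeholders for your own objects):

```sql
-- List the column definitions HANA holds for the local table whose
-- rows get pushed into the remote Oracle temp table. Numeric columns
-- whose LENGTH exceeds 38 cannot map to Oracle's NUMBER(p, s).
SELECT column_name, data_type_name, length, scale
  FROM SYS.TABLE_COLUMNS
 WHERE schema_name = 'MY_SCHEMA'
   AND table_name  = 'USER_GROUPS'
 ORDER BY position;
```

Matching the output against the generated CREATE GLOBAL TEMPORARY TABLE statement column by column should point at which source column became number(65535, 0).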

Answers (1)


lbreddemann
Active Contributor

As an SAP employee, you might be better off asking for an explanation of undocumented parameters in an SAP internal forum or directly from the development team.

One thing I can tell you right away is that guessing what a parameter does (even for the "obvious" ones) usually doesn't produce the correct answer.

enable_remote_cache -> creates a cache on the remote server? creates a cache that is rather remote? caches data for access by remote systems? ...