
Predictive Analytics to SAP HANA wrong data type

Former Member
0 Kudos

Hello all.

We have a problem when we try to use the PAL/R libraries on SAP HANA, called directly from PA 2.5.

PA wants to create table types on HANA, but it uses the type String for some columns. HANA does not support String; PA should use VARCHAR instead, so the operation ends with an SQL error.

This is probably a problem with data type mapping, but I cannot find anything on it (SAP Notes or internet discussions).

Has anyone else run into this issue?

The full SQL that PA tries to run on HANA is below.

Thanks.

Peter

CREATE TYPE PAS12_READER_0_TYPE AS TABLE ("row_id"  INT , "AGE_INSURED_ANALYSIS_DATE_1" DOUBLE ,"AGE_POISTENY_PLANOVANE_DOZITIE" DOUBLE ,"AGE_POISTENY_PRI_PODPISE" DOUBLE ,"AGE_POISTNIK_PLANOVANE_DOZITIE" DOUBLE ,"AGE_POISTNIK_PRI_PODPISE" DOUBLE ,"AGE_POISTNIK_DATUM_ANALYZY" DOUBLE ,"COUNT" INTEGER ,"DEAD_1" DOUBLE ,"DOBA_DO_DOZITIA" DOUBLE ,"DOBA_TRVANIA" DOUBLE ,"DURATION" DOUBLE ,"FLAG_INSURER_INSURED_1" DOUBLE ,"LIVE" DOUBLE ,"PLANOVANA_DOBA_TRVANIA" DOUBLE ,"PREMIUM_1" DOUBLE ,"PREMIUM_COUNT" DOUBLE ,"PREMIUM_TERM_1" DOUBLE ,"PRODUCT01" DOUBLE ,"PRODUCT02" DOUBLE ,"PRODUCT04" DOUBLE ,"PRODUCT05" DOUBLE ,"PRODUCTS" DECIMAL ,"STATUS_1" DOUBLE ,"STORNO_ENDOWMENT" DOUBLE ,"STORNO_JUBILEE_EVENT" DOUBLE ,"STORNO_NON_PAYMENT" DOUBLE ,"STORNO_OTHERS" DOUBLE ,"STORNO_TERMINATION" DOUBLE ,"TYPSCH_ID" DOUBLE ,"DOZITIE" DOUBLE ,"LAT_KRAJ" DOUBLE ,"LAT_OKRES" DOUBLE ,"LONG_KRAJ" DOUBLE ,"DISTRICT" VARCHAR (255)  ,"LONG_OKRES" DOUBLE ,"PLAT_PERIODA" DOUBLE ,"COUNTRY" VARCHAR (255)  ,"POCET_POIST" DOUBLE ,"POCET_PREDPISOV" DOUBLE ,"POCET_PRIPOISTENI" DOUBLE ,"POCET_UPOMIENOK" DOUBLE ,"AGE_INSURED_ANALYSIS_DATE" INTEGER ,"AGE_INSURED_CONTRACT_DATE" INTEGER ,"AGE_INSURED_PLAN_SURVIVAL" INTEGER ,"REGION.description" String (255)  ,"AGE_INSURER_ANALYSIS_DATE" INTEGER ,"AGE_INSURER_CONTRACT_DATE" INTEGER ,"AGE_INSURER_INTERVAL_ANALYSIS_DATE" VARCHAR (255)  ,"AGE_INSURER_PLAN_SURVIVAL" INTEGER ,"ALL_DATA" INTEGER ,"AUTONUMBER" INTEGER ,"BIRTHDATE" DATE ,"BIRTH_DATE" DATE ,"COUNTRY_NAME.description" VARCHAR (255)  ,"DEAD" DOUBLE ,"DISTRIBUTION_CHANNEL" INTEGER ,"DISTRICT_NAME.description" VARCHAR (21)  ,"DURATION_IN_MONTHS" INTEGER ,"DURATION_IN_YEARS" DOUBLE ,"DURATION_IN_YEARS_ROUNDED" INTEGER ,"ENDOWMENT_ASSURANCE" DOUBLE ,"ENDOWMENT_TERM" INTEGER ,"END_DATE" DATE ,"FLAG_INSURER_INSURED" INTEGER ,"FLAG_LAPSE" VARCHAR (1)  ,"GENDER" VARCHAR (4)  ,"GENDER_INS" VARCHAR (1)  ,"COUNTRY.description" String (255)  ,"COUNTRY.description" VARCHAR (255)  ,"DISTRICT.description" String 
(255)  ,"DISTRICT.description" VARCHAR (21)  ,"GEOGRAPHY_OKRES_LONGLAT_KEY_3" VARCHAR (257)  ,"GEOGRAPHY_OKRES_LONGLAT_KEY_4" VARCHAR (257)  ,"GEOGRAPHY_OKRES_LONGLAT_KEY_5" VARCHAR (257)  ,"DISTRICT_LATITUDE" DOUBLE ,"DISTRICT_LONGITUDE" DOUBLE ,"COUNTRY_FEATUREID" VARCHAR (255)  ,"COUNTRY_LATITUDE" DOUBLE ,"COUNTRY_LONGITUDE" DOUBLE ,"OKRES_LONGLAT_REGION" String (255)  ,"REGION_FEATUREID" VARCHAR (255)  ,"REGION_LATITUDE" DOUBLE ,"REGION_LONGITUDE" DOUBLE ,"REGION.description" VARCHAR (255)  ,"HIERARCHY_KEY_2" VARCHAR (257)  ,"HIERARCHY_KEY_3" VARCHAR (255)  ,"SKUP_PRODUKTOV_TXT_2" VARCHAR (255)  ,"PRODUKT_ID_2" INTEGER ,"INSURANCE_STATUS" VARCHAR (3)  ,"INSURED_BIRTHDATE" DATE ,"INSURED_COUNTRY" VARCHAR (4)  ,"INSURED_COUNT" INTEGER ,"INSURED_GENDER" VARCHAR (4)  ,"INSURED_ZIP" VARCHAR (7)  ,"LAST_MONTH" INTEGER ,"LAST_QUARTER" INTEGER ,"LAST_YEAR" INTEGER ,"LAT_DISTRICT" DOUBLE ,"LAT_REGION" DOUBLE ,"LIFE_STATUS" INTEGER ,"LIFE_STATUS_TXT" VARCHAR (43)  ,"LIVE_TRUE" INTEGER ,"LONG_DISTRICT" DOUBLE ,"LONG_REGION" DOUBLE ,"PAYMENT" VARCHAR (2)  ,"PAYMENT_TXT" VARCHAR (21)  ,"PAY_END_DATE" DATE ,"PAY_PERIOD" INTEGER ,"PERIOD" VARCHAR (13)  ,"PLAN_TAKE_TIME" INTEGER ,"PREMIUM" DOUBLE ,"PREMIUM_TERM" DOUBLE ,"PRESCRIPTION_COUNT" INTEGER ,"PRODUCT01_TRUE" INTEGER ,"PRODUCT02_TRUE" INTEGER ,"PRODUCT04_TRUE" INTEGER ,"PRODUCT05_TRUE" INTEGER ,"PRODUCT_GROUP" VARCHAR (47)  ,"PRODUCT_ID" INTEGER ,"REGION_NAME.description" VARCHAR (255)  ,"REMINDER_COUNT" INTEGER ,"RIDER_COUNT" INTEGER ,"START_DATE" DATE ,"STATE" VARCHAR (4)  ,"STATUS" VARCHAR (24)  ,"STAT_ID" VARCHAR (2)  ,"STORNO_DATE" TIMESTAMP ,"STORNO_ENDOWMENT_TRUE" INTEGER ,"STORNO_JUBILEE_EVENT_TRUE" INTEGER ,"STORNO_NON_PAYMENT_TRUE" INTEGER ,"STORNO_OTHERS_TRUE" INTEGER ,"STORNO_TERMINATION_TRUE" INTEGER ,"SWITCH_OF_PERIOD" VARCHAR (255)  ,"TIME_STORNO_DAT_DAY" TINYINT ,"TIME_STORNO_DAT_MONTH" TINYINT ,"TIME_STORNO_DAT_QUARTER" TINYINT ,"TIME_STORNO_DAT_DATE_SQL" TIMESTAMP ,"TIME_STORNO_DAT_YEAR" INTEGER ,"ZIP" 
VARCHAR (5)  );
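The failure boils down to the String columns in the generated DDL. A minimal sketch (with a hypothetical type name) shows the pattern HANA rejects and the corrected form:

```sql
-- Fails: String is not a valid data type in a HANA table type definition
CREATE TYPE DEMO_READER_TYPE AS TABLE ("description" String (255));

-- Works: the same column declared as VARCHAR (or NVARCHAR for Unicode data)
CREATE TYPE DEMO_READER_TYPE AS TABLE ("description" VARCHAR (255));
```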

Accepted Solutions (0)

Answers (1)


abdel_dadouche
Active Contributor
0 Kudos

Hi,

Can you tell me the original data types of the following columns used in your Auto Classification:

- COUNTRY.description

- REGION.description

- DISTRICT.description

- OKRES_LONGLAT_REGION

Thanks

@bdel

Former Member
0 Kudos

Hello Abdel.

I hope this provides the necessary information:

1. We use Lumira Desktop to publish data to the HANA DB (schema on the HANA XS trial).

2. Lumira also creates an Analytical view.

3. In the published data (HANA), the data types are:

- COUNTRY.description VARCHAR(30)

- DISTRICT.description VARCHAR(21)

- REGION.description VARCHAR(16)

- DISTRICT.description VARCHAR(21)

- OKRES_LONGLAT_REGION VARCHAR(2)

4. We connect PA to the same HANA DB and run PAL/R functions on it.

5. PA sends the wrong SQL to HANA as part of the processing.

One thought: string is possible as a column store data type in HANA, but it should not be used in an SQL table type definition.
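One way to cross-check what HANA actually recorded for the published table, both the SQL type and the internal column store type, is to query the SYS.TABLE_COLUMNS system view (schema and table names below are placeholders for the Lumira-published dataset):

```sql
-- Hypothetical schema/table names; adjust to your Lumira-published dataset
SELECT COLUMN_NAME, DATA_TYPE_NAME, LENGTH, CS_DATA_TYPE_NAME
FROM SYS.TABLE_COLUMNS
WHERE SCHEMA_NAME = 'MY_SCHEMA'
  AND TABLE_NAME  = 'MY_LUMIRA_DATASET'
ORDER BY POSITION;
```

If DATA_TYPE_NAME shows VARCHAR for the affected columns, the String type can only be coming from the mapping step in PA, not from the stored data.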

Peter

abdel_dadouche
Active Contributor
0 Kudos

Hi,

Thanks for your prompt response.

I saw that, for example, ZIP is properly handled as VARCHAR(5), so there must be something in the Analytical view definition causing this.

Would you be able to provide the definition of the generated Analytical view (for example using the auto-documentation export feature)?

Also, since you are trying to use "Auto-Classification", why not try the Automated Analytics/Modeler instead of Expert Analytics?


Regards

@bdel