
HCI-DS: Cannot view data for target datastore

Former Member
0 Kudos

Hi,

We’re trying to upload customer master from a flat file to S&OP On Demand using HCI-DS.

After execution, the task status is set to “Completed successfully”, yet displaying the raw data for the target datastore fails with the error message “Unable to retrieve data from the table. Reimport the table and try again. Exception while getting row count: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [259]: invalid table name: Could not find table/view SOPMD_STAG_ZULTCUSTOMER in schema SAPSOPINTEG”.

Any clue as to what the root cause could be?

Bernard

Accepted Solutions (1)

Alecsandra
Product and Topic Expert
0 Kudos

If the status is Completed successfully, it is very likely that the data loaded into the staging table has already been deleted (once the S&OP stored procedure finishes). But that would not generate the error you mentioned; you would simply not see any data linked to the job you ran.

What is the name of the customer master data table in S&OP? Could you try re-importing it?

Former Member
0 Kudos

Hi Alecsandra,

Exactly, and the value of the attribute Date_last_loaded is set to “NotYetLoaded”, which seems odd. The target customer table name is SOPMD_STAG_ZULTCUSTOMER. I did try to reimport the data, with the same result.

There is an entry in the error log for the task: “A row delimiter was seen for row number <1> while processing column number <10> in file <C:/Data/KNA1.csv>. The row delimiter should be seen after <169> columns. Check the file for bad data, or redefine the input schema for the file by editing the file format in the UI”. Could that be related?

Bernard

Alecsandra
Product and Topic Expert
0 Kudos

The error from the log indicates that the CSV file you're trying to import doesn't match the file format defined at datastore level. Perhaps you could check the number of columns, the delimiter used, and so on.
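
A minimal way to check this locally before re-running the task is sketched below in Python; the path C:/Data/KNA1.csv and the expected count of 169 columns are taken from the error log above, while the comma delimiter is an assumption about how the file format was defined.

    import csv

    # Values from the error log above; the delimiter is an assumption about
    # how the file format was defined at datastore level.
    FILE_PATH = "C:/Data/KNA1.csv"
    EXPECTED_COLUMNS = 169
    DELIMITER = ","

    with open(FILE_PATH, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=DELIMITER)
        for row_number, row in enumerate(reader, start=1):
            if len(row) != EXPECTED_COLUMNS:
                # Any row reported here is a candidate for the "row delimiter
                # was seen" error, because its column count does not match
                # the file format expected by HCI-DS.
                print(f"Row {row_number}: {len(row)} columns, "
                      f"expected {EXPECTED_COLUMNS}")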

Former Member
0 Kudos

I've deleted the unnecessary columns from the source datastore and reimported the data, but I'm still getting the error message "Unable to retrieve data from the table. Reimport the table and try again. Exception while getting row count: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [259]: invalid table name: Could not find table/view SOPMD_STAG_ZULTCUSTOMER in schema SAPSOPINTEG".

Task error log is empty.
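
As a side note on the "invalid table name" message: if you happen to have direct SQL access to the underlying HANA database (often not the case for a cloud tenant), a catalog query can at least confirm whether the staging table exists in the SAPSOPINTEG schema. This is only a rough sketch; the host, port and credentials are placeholders.

    from hdbcli import dbapi  # SAP HANA Python client

    # Placeholder connection details; direct access to the cloud schema may
    # not be available in every landscape.
    conn = dbapi.connect(address="hana-host", port=30015,
                         user="USER", password="PASSWORD")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT TABLE_NAME FROM SYS.TABLES "
            "WHERE SCHEMA_NAME = ? AND TABLE_NAME = ?",
            ("SAPSOPINTEG", "SOPMD_STAG_ZULTCUSTOMER"))
        if cursor.fetchone() is None:
            print("Table not found in schema SAPSOPINTEG")
        else:
            print("Table exists; the problem is likely on the viewer side")
    finally:
        conn.close()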

Former Member
0 Kudos

Hi Alecsandra,

I’ve created another case, this time to load the Event Master in a different planning area and ended up with the same result.

Here’s what I did:

  • In source datastore, created file format from scratch and added columns for source fields;
  • In target datastore, created table by importing object SOPMD_STAG_ULTEVENT from Master Data Folder;
  • Created task by copying from template SOP_File_Task;
  • Added target object by importing SOPMD_STAG_ULTEVENT;
  • In the DF editor, added the source file and a query to extract from it; mapped fields from the source file to the extract query, and from the extract query to the target query; in the execution properties, changed the global variable $G_PLAN_AREA to point to the proper planning area; at validation, a warning was issued for some fields not being mapped to the target query, but as mentioned in this tutorial that should be fine;
  • Ran the task and did not get any entry in the error log;
  • Trying to view the staging table content, got the error message "Unable to retrieve data from the table. Reimport the table and try again. Exception while getting row count: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [259]: invalid table name: Could not find table/view SOPMD_STAG_ULTEVENT in schema SAPSOPINTEG".

Any clues as to what could cause the problem?

Thanks,

Bernard

Alecsandra
Product and Topic Expert
0 Kudos

Hi Bernard,

I had the same issue. It seems to be a temporary problem where we are not able to view cloud tables. It should be fixed by next week.

Rgds,

Alecsandra

Former Member
0 Kudos

Just tested, it's fixed now, thanks.

Former Member
0 Kudos

Hi,

We have successfully loaded data into the staging table, but we can't see the data in the core table.

The job in IBP shows a success status. Any idea what the issue could be?

Source: ECC

Target: Staging Table

From the staging table, we need the data to move into the core table.

Thanks,

Purav

Alecsandra
Product and Topic Expert
0 Kudos

Hi Purav,

Once the data passes the integrity check in S&OP, it is moved from the target table into the core table.

Did you check the upload report from the S&OP end? Are the records displayed as successfully loaded?

Regards,

Alecsandra

Former Member
0 Kudos

Hi,

I have checked the Monitor log and it shows the correct number of records. I also verified the data in the staging table against ECC and it is as expected.

I am not sure which report you are asking about. In IBP I checked the Data Integration tab, and all the columns are blank for the corresponding data load entry.

Thanks,

Purav

Alecsandra
Product and Topic Expert
0 Kudos

What are you trying to load into IBP? Is it key figure (KF) data? If yes, did you change the planning area in the execution properties? Did you provide the correct FILENAME in the target query? I am asking because these are the most frequent root causes when data loads successfully in HCI but the log in IBP stays empty.

Perhaps you can provide a screenshot from IBP, from the data load tab (where the job appears).

Regards

Alecsandra

Former Member
0 Kudos

Hi Alecsandra,

I have loaded Customer Master.

The planning area was changed. I am not loading any data through IBP; I am loading it using a task in HCI. But when we run the task in HCI, we also see its log in IBP. Attaching screenshots of the task in HCI and of the IBP log.

HCI Screenshot:

A total of 353 records are available in the staging table; I have verified this against ECC.

IBP Screenshot:

The area marked in red is always blank, but the status is always green.

Note: these are jobs auto-created by the system while running the task in HCI.

Ignore the timestamps, as HCI is on UTC and IBP is on local time.

Thanks,

Purav

Alecsandra
Product and Topic Expert
0 Kudos

Hi Purav,

When you load master data, the planning area is not relevant, as MD can be shared across multiple models.

What I see from your screenshot is that FILENAME is missing.

Could you please check whether FILENAME is mapped in HCI Target Query?

Regards

Alecsandra

Former Member
0 Kudos

Hi,

I couldn't find any place where I can map the filename.

Maybe that's the issue. Could you please send a screenshot of this setting?

Thanks,

Purav

Former Member
0 Kudos

Configured DataFlow:

Alecsandra
Product and Topic Expert
0 Kudos

See below.

Go into the Target Query (MapToTarget) and ensure that FILENAME contains the staging table name.

In the example below, the FILENAME mapping is 'SOPMD_STAG_SM1CUSTOMER'. You must ensure that you have your own customer MD table name between the ''.

Regards

Alecsandra

Former Member
0 Kudos

OK, I found the mapping for FILENAME and changed it to the correct table, but IBP still shows a blank filename.

Former Member
0 Kudos

Hi,

Do we need to specify the staging table name or the target table name in FILENAME?

Thanks,
Purav

Alecsandra
Product and Topic Expert
0 Kudos

You need to provide the staging table. Looking at your screenshot, you have to map it to 'SOPMD_STAG_NOWCUSTOMER'.

Former Member
0 Kudos

Hi,

Thank you very much. It did work.

A couple of questions about HCI:

  • How do we handle delta in HCI? I don't see any option where I can mark delta / full in the data flow.
  • What is the use of the global variable declaration? What impact does it have?
  • Is there any guide that teaches how to write preload and postload scripts? I would like to learn the syntax.

Thanks,
Purav

Alecsandra
Product and Topic Expert
0 Kudos
  • The upload mode options you have are those from S&OP: replace, insert_update and delete. HCI supports delta extractors from the SAP Business Suite. If there's no delta queue in the source, there is no straightforward way to achieve this. You could constrain your job so that it picks up only the desired entries: for example, if you have a date column in your source table, you could create a global variable for the last extract date and filter your entries where date > last extract date (a rough sketch of this pattern follows after this list).
  • Global variables are used to parameterize your tasks; you can change these variables without changing your data flow. Some of the global variables are used by IBP to process the data after it is loaded, like the planning area. Perhaps you can check the document below for more details: http://help.sap.com/businessobject/product_guides/hci1/en/hci10_integration_ibp_en.pdf
  • I don't know where you could find more info; maybe checking the on-premise Data Services guide would help.
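
A rough sketch of the last-extract-date idea from the first point, written as plain Python rather than as an HCI-DS filter expression; the file name, the CHANGED_ON column and the $G_LAST_EXTRACT_DATE variable are made-up names for illustration only.

    from datetime import date
    import csv

    # Hypothetical high-water mark, e.g. the value of a global variable such
    # as $G_LAST_EXTRACT_DATE that is maintained between task runs.
    last_extract_date = date(2015, 6, 1)

    delta_rows = []
    # "source_extract.csv" and the "CHANGED_ON" column are illustrative only.
    with open("source_extract.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            changed_on = date.fromisoformat(row["CHANGED_ON"])
            # Keep only records changed since the last successful extract,
            # mirroring a filter of the form CHANGED_ON > $G_LAST_EXTRACT_DATE.
            if changed_on > last_extract_date:
                delta_rows.append(row)

    print(f"{len(delta_rows)} delta records to load")
    # After a successful load, the high-water mark would be advanced
    # (for example to today's date) ready for the next run.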

Regards

Alecsandra

Former Member
0 Kudos

Hi,

Thank you very much.

I was also assuming that delta would be handled only through such date columns and filters.

Once again thanks for all the support.

Thanks,

Purav

Former Member
0 Kudos

Hi,

We created a DF for loading data into the KF staging table. Data is getting loaded into the staging tables, but we are unable to find / load the data in the core table. Also, we can't see a core table for the KF while importing tables in the datastore. Any idea how to proceed with this transaction data?

Thanks,

Purav

Alecsandra
Product and Topic Expert
0 Kudos

Hi Purav,

After the S&OP integrity check, data is moved from the staging table into the core tables.

Once the data has reached the core tables, you are no longer able to view it from HCI.

Did you check the S&OP log? If the records are successful, you should be able to download them in a planning view.

Regards,

Alecsandra

mahlako
Explorer
0 Kudos

Hi Alecsandra,

We are able to view the data in the target datastore (the staging table); however, CPI-DS is still executing post-processing procedures against the IBP instance, and the tasks fail in CPI-DS after the time limit lapses but remain in "processing" state in the IBP Data Integration app.

We tried importing data using the Data Integration app of the Web UI, and no data is being transferred to the core tables; instead, the integration jobs have remained in "processing" for the past few days. We executed a purge data import batches job and we are still having the same problem.

Is there any global configuration setting we need to make on our IBP system to ensure that data is integrated and processed into our core tables?

Thank you kindly

Malebo

Answers (0)