
Insert fails

Former Member

I have a column table with approximately 64 columns and about 56,113,305 rows. When I try to make a copy of this table using CREATE COLUMN TABLE "table2" LIKE "table" WITH DATA, it fails with the following exception:

"ims_search_api/RowIdValueColumn.cpp:466 exception in RowIdValueColumnSource"

I cannot think of any reason for this, other than that I might be exceeding some maximum number of rows per insert. Is there such a limit?
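For reference, this is roughly what I am running (schema and table names simplified), together with a query I can use to double-check the size of the source table:

CREATE COLUMN TABLE "MYSCHEMA"."TABLE2" LIKE "MYSCHEMA"."TABLE1" WITH DATA;

-- Size of the source table as reported by the column-store monitoring view
SELECT TABLE_NAME, RECORD_COUNT, MEMORY_SIZE_IN_TOTAL
FROM M_CS_TABLES
WHERE SCHEMA_NAME = 'MYSCHEMA' AND TABLE_NAME = 'TABLE1';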

Accepted Solutions (0)

Answers (2)


Former Member

Hi Muhammad,

Yes, that is indeed a very large table. Can you attach a screenshot of the full error so we can rule out things like authorisations? Also, instead of creating the copy with SQL, first try the wizard: right-click on the table > Generate > Use Table as Template. This should help you quickly create a copy. Activate it, then use the same wizard path but select Generate > Insert Statement to auto-format your insert statement.

My guess is that, if it is not a limit or an authorisation problem, it is a manual error when writing the insert statement, which must account for every column, e.g. INSERT ... (col1, col2, col3, col4, ...).

Try these options. If you can insert with this method without an error, you will know that the row-limit assumption is not the issue.
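If you prefer to stay in SQL, the pattern would look roughly like this; the column names are only placeholders, and the wizard will generate the full 64-column list for you:

CREATE COLUMN TABLE "MYSCHEMA"."TABLE2" LIKE "MYSCHEMA"."TABLE1" WITH NO DATA;

-- COL1..COL3 stand in for all 64 columns; the insert statement must list
-- every column explicitly on both sides.
INSERT INTO "MYSCHEMA"."TABLE2" ("COL1", "COL2", "COL3")
SELECT "COL1", "COL2", "COL3"
FROM "MYSCHEMA"."TABLE1";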

Kind regards,

Danielle 

Former Member

This time I created the table using your approach (template). Also, this time I made the table row-based instead of column-based. To give you some history: when I originally loaded the table mentioned above (approx. 64 columns and about 56,113,305 rows), I had to do it in batches of about 5,000,000 rows. The tables these batches were pulled from are in the same schema. Even so, as the number of rows grew, the inserts started to hang, and I had to restart my AWS instance several times to get all the data into one table. Could this be the system running out of memory?
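For what it is worth, each batch was essentially an insert-select from one of the source tables, roughly like this (all names simplified):

-- One batch of the load; repeated with each of the remaining source tables.
INSERT INTO "MYSCHEMA"."BIG_TABLE"
SELECT * FROM "MYSCHEMA"."SOURCE_PART_1";

INSERT INTO "MYSCHEMA"."BIG_TABLE"
SELECT * FROM "MYSCHEMA"."SOURCE_PART_2";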

Former Member

Hi Muhammad,

I have repeated a test on my system using a very large table, MARC. A select-all insert executed successfully in 4:28.705 minutes (Rows Affected: 7051024).
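The test itself was essentially just this (schema and target names simplified):

CREATE COLUMN TABLE "TEST"."MARC_COPY" LIKE "TEST"."MARC" WITH NO DATA;

INSERT INTO "TEST"."MARC_COPY"
SELECT * FROM "TEST"."MARC";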

No batch loading was necessary, and this was done directly in HANA Studio. My guess is that yes, it is your memory. HANA executions are affected by the memory of the machine you work on; my team has worked on laptops where the performance of a simple data preview fetch varied significantly. Check your memory and monitor it. I've got 248M on my system. You can see the monitor in the lower right-hand corner of the studio.

Kind regards,

Danielle

Former Member

Hi Danielle,

First of all, thanks a lot for trying it out. I need to clarify a few points: my HANA database is hosted on AWS (Amazon's cloud), and my laptop only runs the client, so I don't think my laptop's memory is the issue. I suspect that, due to the huge volume of data in the HANA database, the database server itself might be running into memory problems.

Secondly, when I inserted in batches, the earlier batches loaded quite quickly, even though some were larger than 5,000,000 rows, but after the first few batches the later ones started causing problems.

Also, I was not able to find the system monitor in the client studio. Could you kindly tell me how to open it?

Thanks

Moaz

Former Member

Hi Muhammad,

To answer your initial question: I think we can conclude that your issue is not a maximum number of rows per SQL insert.

I am unfortunately not able to replicate your issue, but if I had to guess, it definitely looks like memory. Whether it is on the server side or on your local machine, the best way to sort out the issue now is to monitor the logs and see where the bottleneck is.
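On the server side you can also query HANA's monitoring views directly instead of relying on the studio widget; something along these lines, assuming your revision exposes this view and these columns:

-- Rough check of physical memory on the HANA host, converted to GB.
SELECT HOST,
       ROUND(USED_PHYSICAL_MEMORY / 1024 / 1024 / 1024, 2) AS USED_GB,
       ROUND(FREE_PHYSICAL_MEMORY / 1024 / 1024 / 1024, 2) AS FREE_GB
FROM M_HOST_RESOURCE_UTILIZATION;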

Please let me know if you sort out the issue; I hope this has helped.

Kind regards,

Danielle

rama_shankar3
Active Contributor

Muhammad:

Does your table have partitions?
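If not, partitioning the large table (or its copy) before loading can spread the memory and delta-merge load across smaller parts; a rough sketch, where the partitioning column and partition count are only illustrations:

-- Hypothetical example: hash-partition the table on one of its columns.
ALTER TABLE "MYSCHEMA"."BIG_TABLE"
  PARTITION BY HASH ("ID") PARTITIONS 4;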

Regards,

Rama

Former Member

No, I have not used them.