
error while importing .csv file

hardik_patel
Participant
0 Kudos

Hi,

I tried to import a flat .csv file from HANA Studio, following http://scn.sap.com/docs/DOC-27960

I have HANA Studio 1.0.29 and HANA Database 1.0.26.

Below is the error from my log file:

<status>ERROR</status>

<info>Data Load - Batch Information</info>

<message>

<status>ERROR</status>

<info>Batch from Record 1 to 3401 Failed: [301]: unique constraint violated(input position 482)</info>

<error>com.sap.db.jdbc.exceptions.BatchUpdateExceptionSapDB: [301]: unique constraint violated(input position 482)

<br/> at com.sap.db.jdbc.CallableStatementSapDB.executeBatch(CallableStatementSapDB.java:681)

<br/> at com.sap.db.jdbc.trace.PreparedStatement.executeBatch(PreparedStatement.java:615)

<br/> at com.sap.ndb.studio.bi.filedataupload.deploy.populate.PopulateSQLTable.populateTable(PopulateSQLTable.java:89)

<br/> at com.sap.ndb.studio.bi.filedataupload.deploy.job.FileUploaderJob.uploadFlatFile(FileUploaderJob.java:186)

<br/> at com.sap.ndb.studio.bi.filedataupload.deploy.job.FileUploaderJob.run(FileUploaderJob.java:59)

<br/> at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)

<br/></error>

</message>

I am new to HANA Studio and the HANA database. Can anyone help?

Thanks for your time.

Regards,

Hardik

Accepted Solutions (1)


hai_murali_here
Advisor
Advisor
0 Kudos

Hi Hardik,

As Rama mentioned, if you are storing the data in a column-store table, then you need to make sure that no duplicate entries exist for the primary key column.

I suspect your input file has duplicate records in the primary key column. Just remove those duplicate records and then try the load again.
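Before re-running the load, it can help to check the file itself for duplicates. A minimal sketch, assuming the primary key is the first column and that the rows come from `csv.reader` over your .csv file (the file name and column index are assumptions, not from this thread):

```python
# Hypothetical sketch: find primary-key values that occur more than once
# in the records of a flat file, before loading them into a table with
# a primary key. key_index=0 is an assumption; adjust to your layout.
import csv
from collections import Counter

def find_duplicate_keys(rows, key_index=0):
    """Return key values that appear in more than one record."""
    counts = Counter(row[key_index] for row in rows)
    return [key for key, n in counts.items() if n > 1]

def duplicates_in_csv(path, key_index=0):
    """Convenience wrapper: check a .csv file on disk (header assumed)."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        return find_duplicate_keys(reader, key_index)
```

Any value returned by `find_duplicate_keys` points at records that would trip the `[301]: unique constraint violated` error.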

Regards,

Murali

Former Member
0 Kudos

Hi Hardik,

Please also take care of the data types, for example date, quantity, etc.

hardik_patel
Participant
0 Kudos

Thanks to all for your replies. I resolved the issue.

I want to practice with information views, and I only have one .csv file. Does anyone have a .csv file that I can use?

Regards,

Hardik

Answers (5)


Former Member
0 Kudos

Hi,

While importing, I converted the .csv file to an Excel file, but I still get the following error. Please help me with it.

Batch from Record 2 to 2363 Failed

java.lang.NullPointerException

          at com.sap.db.jdbc.CallableStatementSapDB.makeBatchCountArray(CallableStatementSapDB.java:709)

          at com.sap.db.jdbc.CallableStatementSapDB.executeBatch(CallableStatementSapDB.java:676)

          at com.sap.db.jdbc.trace.PreparedStatement.executeBatch(PreparedStatement.java:615)

          at com.sap.ndb.studio.bi.filedataupload.deploy.populate.PopulateSQLTable.populateTable(PopulateSQLTable.java:98)

          at com.sap.ndb.studio.bi.filedataupload.ui.job.FileUploaderJob.uploadFlatFile(FileUploaderJob.java:197)

          at com.sap.ndb.studio.bi.filedataupload.ui.job.FileUploaderJob.run(FileUploaderJob.java:60)

          at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)

Former Member
0 Kudos

Hi,

I am importing the table as a row-store type, for which we don't need a primary key.

Thanks,

Nisha

Former Member
0 Kudos

Hi,

I have a problem importing a .csv file into SAP HANA Studio.

It shows that the data load was aborted.

Please tell me the solution.

Thanks,

Nisha

former_member182277
Contributor
0 Kudos

Hello Hardik,

The problem can happen when the length of an inserted record is larger than the length defined for the particular column.

Please check the data types and their lengths too.
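This length check can also be done on the file before loading. A minimal sketch, where the list of maximum lengths is an assumption you would take from your own table definition (e.g. VARCHAR(10) gives a limit of 10; `None` means no limit):

```python
# Hypothetical sketch: flag fields longer than the declared column
# lengths, which would also abort the load. max_lengths is an
# assumption taken from the target table's definition.
def oversized_fields(rows, max_lengths):
    """Return (record_position, column_index) pairs that exceed limits."""
    problems = []
    for pos, row in enumerate(rows, start=1):
        for col, (value, limit) in enumerate(zip(row, max_lengths)):
            if limit is not None and len(value) > limit:
                problems.append((pos, col))
    return problems
```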

Regards,

Neha


Former Member
0 Kudos

Hello Hardik,

I have the same problem. I tried to import a .csv flat file with a document number, line item positions, and some other columns with amounts and dates. The problem is that I get the same error due to non-uniqueness.

It is correct that some document numbers occur more than once, due to the number of contained line items. But taking both columns together as the key, the records are unique.

Does anybody have an idea how to import this kind of data?

@Hardik: How did you solve the problem?

Thanks for your support and kind regards

Falk

rama_shankar3
Active Contributor
0 Kudos

Please make both columns part of the key in the table and try loading again.
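Before changing the table definition, it can be worth confirming that the two columns together really are unique in the file. A minimal sketch, assuming column 0 is the document number and column 1 the line item (indexes are assumptions; rows would typically come from `csv.reader`):

```python
# Hypothetical sketch: check whether two columns taken together form a
# unique composite key before redefining the table's primary key.
# key_indexes=(0, 1) is an assumption; adjust to your file layout.
from collections import Counter

def composite_key_is_unique(rows, key_indexes=(0, 1)):
    """True if no combination of the key columns repeats."""
    counts = Counter(tuple(row[i] for i in key_indexes) for row in rows)
    return all(n == 1 for n in counts.values())
```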

Regards,

Rama

hardik_patel
Participant
0 Kudos

Hi Falk,

If I recall correctly, I had the dataset in SAP HANA binary format and had problems importing that data into SAP HANA, but then I found the process for importing the HANA binary format into SAP HANA.

What format of data do you have?

Regards,

Hardik

Former Member
0 Kudos

Rama, Hardik,

Thanks a lot. It was something different from the issue I posted here. The problem was that I had 3 lines where, for line 3, column 4 was empty, whereas for the other lines the column was filled. So I decided not to take this column as a key column, for which HANA told me that there are some non-unique entries.

I added the column to the key, but due to the NOT NULL prerequisite I had the next problem. The only way to solve it was to give the column a default value of 0, which is OK, but not what I expected.

Is there any possibility to import data as posted below? These 3 lines exist in the database table, but there seems to be no way to get them into HANA.

document | line item | amount | repetition
------------------------------------------
10000001 |         1 | 100,00 |
10000001 |         1 | 100,00 |          1
10000001 |         1 | 100,00 |          2
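One client-side workaround for rows like these (not from this thread, just a sketch) is to add a running counter column while preparing the file, so otherwise-identical records become distinguishable and the counter can join the composite key. The column name and position are assumptions:

```python
# Hypothetical sketch: append a running "occurrence" counter to each
# record so that otherwise-identical rows become unique and the counter
# can be made part of a composite key before loading.
from collections import defaultdict

def add_occurrence_column(rows):
    """Return rows with an extra trailing column counting repetitions."""
    seen = defaultdict(int)
    out = []
    for row in rows:
        key = tuple(row)
        seen[key] += 1
        out.append(list(row) + [seen[key]])
    return out
```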

Thank you very much for helping on this issue.

Best regards,

Falk

rama_shankar3
Active Contributor
0 Kudos

Falk:

You will have to write the logic in an insert trigger on the HANA table. Please refer to the developer guides for samples.

Let me know if you have issues finding it. I can send you a link.

Regards,

Rama

rama_shankar3
Active Contributor
0 Kudos

Hardik:

It appears that you have duplicate records in your flat file with respect to the target table's primary keys. For example, if you have Order # and Customer # as the primary key in your target table, your flat file cannot contain more than one record with the same Order # and Customer # combination.
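To locate the offending record, a small check can mirror the `(input position N)` hint from the loader's error message. A sketch, where the key column indexes are assumptions:

```python
# Hypothetical sketch: return the 1-based position of the first record
# whose primary-key combination repeats, similar to the
# "(input position N)" hint in the batch load error. key_indexes is an
# assumption; adjust to the target table's primary key columns.
def first_duplicate_position(rows, key_indexes=(0,)):
    """Return the position of the first repeated key, else None."""
    seen = set()
    for pos, row in enumerate(rows, start=1):
        key = tuple(row[i] for i in key_indexes)
        if key in seen:
            return pos
        seen.add(key)
    return None
```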

 

Please check. Hope this helps.

Regards,

Rama