
Database command execution error

Former Member

Hi all,

I have this kind of problem:

On a rather large repository (over 1M records) I'm getting an error when I try to add a record to a qualified table.

The log files on the server side look as follows:

<Trace ..... tid="4936" entry-no="5542">OLEDB error on SERVER/REP_NAME from statement: INSERT INTO A2i_1_85 (Id, SubId, LinkId, F69, F71) VALUES (1348317, 1, 1348625, 1, 1)<LINE-FEED/></Trace>

<Trace ..... tid="4936" entry-no="5543"> ErrorRecord->HResult/NativeErr: 0x80040e2f/0<LINE-FEED/> Description: The statement has been terminated.<LINE-FEED/> Source/File/Line: Microsoft OLE DB Provider for SQL Server/.\OLEDB_DBConnection.cpp/1744<LINE-FEED/></Trace>

<Trace ..... tid="4936" entry-no="5544">ErrorRecord->HResult/NativeErr: 0x80040e2f/0<LINE-FEED/> Description: Cannot insert duplicate key row in object &apos;A2i_1_85&apos; with unique index &apos;I1_85_LId&apos;.<LINE-FEED/> Source/File/Line: Microsoft OLE DB Provider for SQL Server/.\OLEDB_DBConnection.cpp/1744<LINE-FEED/></Trace>
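For anyone hitting the same error: a quick way to see which existing row the insert collides with is to query the qualified table directly on the unique index's key column. The table name and values below come straight from the log above; the assumption that index I1_85_LId covers the LinkId column is mine (guessed from the index name) and should be verified first.

```sql
-- Assumption: the unique index I1_85_LId is on LinkId. Verify with:
-- EXEC sp_helpindex 'A2i_1_85';

-- Show the existing row that conflicts with the failed insert from the log
SELECT Id, SubId, LinkId, F69, F71
FROM A2i_1_85
WHERE LinkId = 1348625;
```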

This is the second time this has happened. The last time this problem occurred, the whole repository became unusable, because every Load process ended with an error.

Hint: the Check Repository function ends with a message saying the repository is OK.

My questions are:

- has anybody else seen this kind of error?

- are there any limitations on the number of records that can be stored in MDM (apart from the amount of available memory)?

- how can I fix it and bring the repository back to life?

Thanks in advance,

marcin

Accepted Solutions (0)

Answers (2)


Former Member

Hey

I haven't heard of any such limitation.

Regards,

Tanveer.

<b>Please mark helpful answers</b>

Former Member

Hi

I am not sure, but could it be that a duplicate key record was being inserted?

<i><b>Description: Cannot insert duplicate key row in object &apos;A2i_1_85</b>&apos; with unique index &apos;I1_85_LId&apos;.</i>

Regards,

Ajay

Former Member

Hi, thanks for the reply, Ajay.

I know that MDM is trying to insert rows that would duplicate a key in the underlying database (MS SQL) table.

My question is why the MDM Server tries to use the same key value.

This key value is managed by MDM itself and is not exposed to users of the MDM services.

Maybe it's because of the volume of records stored in the repository; that's why I'm asking about limitations. Or maybe I should look into the MDM Server configuration?

Former Member

Hi Marcin.

1M records is perfectly fine for the main table. I suggest unloading the repository and loading it with "Update Indices"; that should clean things up a bit in the underlying database.

Former Member

Hi Alon, thanks for the reply.

As you suggested, I loaded the repository with the "Update Indices" option, and it's back.

But unfortunately, after adding a few more records (using Data Manager), it fails again (same effect).

I still think there is an issue with qualified lookup tables and transactionality between the MDM Server and the underlying database. When an add-record operation fails, the data entered into the qualified table remains in the database table (that's why another attempt to add a record fails with a duplicate index).

Regards, marcin
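To illustrate the transactionality point above: if the add-record operation were wrapped in a single database transaction, a failure partway through would roll back the qualified-table insert instead of leaving an orphaned row behind. A minimal T-SQL sketch of that pattern, reusing the table and values from the log (this is only an illustration, not MDM's actual code, and TRY...CATCH requires SQL Server 2005 or later):

```sql
BEGIN TRANSACTION;
BEGIN TRY
    -- the qualified-table insert from the log
    INSERT INTO A2i_1_85 (Id, SubId, LinkId, F69, F71)
    VALUES (1348317, 1, 1348625, 1, 1);
    -- ... further inserts for the rest of the record would go here ...
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- any failure undoes the partial insert, so no orphaned row is left
    ROLLBACK TRANSACTION;
END CATCH;
```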

Former Member

Marcin,

What are the unique keys in your qualified table? Are they all "Non-Qualifier" fields?

If you add a combination of Non-Qualifier fields to the qualified table, it becomes available as a "Qualified Link", and it looks like you are trying to add another qualified link with the same unique key combination instead of picking the existing one from the "Available Qualified Links". Is this the case?

Regards,

Rajani Kumar

Former Member

Rajani, thanks for the useful info, but the problem turned out to be something else.

I spent a few days on it, and it seems it was a problem with the repository size (in memory).

On record add, the MDM Server tried to allocate memory and failed due to the 32-bit system architecture (by the way, a proper error message would be nice). Unfortunately, part of the new record's data was left in the underlying database after the failure.

My feeling is that there is a lack of transactionality, so the data from the qualified table remains in the database table, and each subsequent attempt to add a record fails with a unique index violation.

Regards, marcin