
Archived discussions are read-only.

Load Table inserting too many rows

Hello, I am trying to extract and load data on the same IQ 16 server (on Windows). The data extract seems fine, and the load works but it inserts too many rows. I have 2 tables, Dwelling_New with the data, and Dwelling_New_loadtest which is an exact copy but empty for testing the load.

This is my code:

set temporary option Temp_Extract_Size1 = '134217728';
set temporary option Temp_Extract_Binary = 'ON';
set temporary option Temp_Extract_Directory = 'E:\archive\';
set temporary option Temp_Extract_Name1 = 'Dwelling_New.dat';

select * from Dwelling_New;

set temporary option Temp_Extract_Name1 = '';

load table Dwelling_New_loadtest (col1, col2, col3 binary with null byte)
using client file 'E:\\archive\\Dwelling_New.dat'
quotes off
escapes off
format binary;

The load works but inserts 8,943,227 records whereas the original table only has 7,567,346 records.

Why is this happening?

Thanks,

Former Member replied:

It looks like you are missing "binary with null byte" after each column; you only have it after col3. The binary extract writes a null byte after every column, so you need to specify it for each one:

load table Dwelling_New_loadtest (col1 binary with null byte, col2 binary with null byte, col3 binary with null byte)
using client file 'E:\\archive\\Dwelling_New.dat'
quotes off
escapes off
format binary;
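For what it's worth, the inflated row count is consistent with the loader walking the same byte stream at a row width two bytes too small (one missing null byte each for col1 and col2). A quick back-of-the-envelope sketch in Python, assuming hypothetically that the three columns hold 10 bytes of data per row (the actual column types aren't shown in the thread):

```python
# Hypothetical widths: the real column types aren't given in the thread.
# Suppose the three columns together hold 10 bytes of data per row.
DATA_BYTES_PER_ROW = 10

true_rows = 7_567_346  # rows in the original Dwelling_New table

# The extract writes a null byte after EVERY column -> 3 extra bytes per row.
actual_row_width = DATA_BYTES_PER_ROW + 3   # bytes per row on disk

# The load spec declared "with null byte" only on col3 -> 1 extra byte expected.
assumed_row_width = DATA_BYTES_PER_ROW + 1  # bytes per row the loader expects

# The loader slices the whole file at the smaller (wrong) row width.
total_bytes = true_rows * actual_row_width
loaded_rows = total_bytes // assumed_row_width

print(f"{loaded_rows:,}")  # 8,943,227
```

With those assumed widths the arithmetic reproduces the reported 8,943,227 figure exactly, which fits the missing-null-byte explanation; with other widths the mechanism is the same but the exact count differs.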

Also, omit the USING CLIENT FILE clause and use USING FILE instead. The extract is written on the server, so USING CLIENT FILE would make this a client-side load, which is slower. Assuming the file is on the server, USING FILE gives the best performance.
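Putting both fixes together, the server-side version of the load would look something like this (a sketch; it assumes the extract file sits on a path visible to the IQ server itself):

```sql
load table Dwelling_New_loadtest
    (col1 binary with null byte,
     col2 binary with null byte,
     col3 binary with null byte)
using file 'E:\\archive\\Dwelling_New.dat'
quotes off
escapes off
format binary;
```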

Or could it be that the file is on a different machine than the IQ server?

Mark
