
Bulk Insert of data into DB

Former Member
0 Kudos

Hi all,

I am looking for some ideas for a bulk insert of data into a given database (Oracle in this case). Say we receive an RFC from SAP which causes MII to insert some 100 rows into a table. What are the alternatives using MII?

A repeater and row-by-row insert is very slow. One possibility may be to dynamically build insert statements in MII into one string variable and then call a query using this variable as the SQL statement. I will try this to check if it is faster than the one-row-at-a-time approach.
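For illustration, such a dynamically built statement could use Oracle's multi-row INSERT ALL syntax, so all rows reach the database in one round trip (the table and values here are made-up placeholders):

INSERT ALL
  INTO MY_TABLE (COL1, COL2) VALUES ('row1', 1)
  INTO MY_TABLE (COL1, COL2) VALUES ('row2', 2)
  INTO MY_TABLE (COL1, COL2) VALUES ('row3', 3)
SELECT * FROM dual;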

Are there any more ideas to improve performance when MII has to insert many rows?

Michael

Accepted Solutions (1)

Former Member
0 Kudos

Most likely, with really large numbers of records, the optimal approach is to transfer the bulk data as an XML or text file to a location that the Oracle server can see (perhaps via FTP), and then initiate a bulk insert as you've done with your 50-record "chunk", but for all of the records at once. For smaller numbers of records, the approach you've taken of sending blocks of records to a stored procedure should work well.

You might also want to consider a custom action that opens/obtains a JDBC connection, turns off autocommit, does the inserts, commits, and closes the connection. This could increase performance substantially. Unfortunately, I have no way to easily benchmark it relative to the other alternatives.

Former Member
0 Kudos

Hi Rick, Hi Michael,

Michael, yes, that is exactly what I mean. Rick's solution with the FTP is one we used at a customer once. They had a file per month with a size of approx. 80MB (more than 200,000 rows) and wanted to import it into Oracle. We suggested importing the file directly with the DB software and the XMLLoader.

I am not sure, but I think the import was extremely fast (approx. 3 min), but maybe I am wrong.

If you would like to know the result of the import, please let me know and I will ask the customer.

Cheers

Pedro

Former Member
0 Kudos

Thanks for your answers!

So if my requirement is to load a big file with many rows into the system, but not very often, I would consider the LOAD DATA from file approach, which seems to be the fastest because all rows are loaded in one step.

If my requirement is to frequently load many rows, I may use the file load or the direct transfer of XML data with the XSU approach.

If the requirement, as in our case, is to populate more than one table from one RFC input, which may sometimes mean inserting a few rows and sometimes many, I think the direct approach using XSU may be a good option.

I will test it in the next few days and come back with some performance results.

Michael

Former Member
0 Kudos

Hi Pedro,

maybe I have to come back to the XMLLoader solution.

Today I started to work on the "real life" example, a 100KB XML document which I have created in a BLT. I wanted to pass the XML to a query as described above, but the query ended with the error "parameter too long". I suppose that parameters in queries are limited in size. Smaller XML documents could be loaded without problems.

Michael

Former Member
0 Kudos

Hi Michael,

at the moment I only have MS SQL available where I can run a test with the SP. I will test it over the weekend and let you know the result.

Regards

Pedro

Former Member
0 Kudos

Pedro,

thanks for your help.

After some testing I found out some more issues.

In the MII query, you have to use the TO_CLOB function in the assignment, otherwise MII complains about the string size if the XML doc is too big (I am testing with 100KB); see the example above.

My table has an ID column which is populated by a sequence. The XMLSaver, however, seems to be unable to use the sequence.nextval function in the XML doc, so I have "reserved" a number interval and hardcoded the IDs in the XML.

Finally I ran into an ORA-04031 error: "unable to allocate 100000 bytes of shared memory". Feels like stop and go...

Michael

Answers (2)

Former Member
0 Kudos

Hi Michael,

If you have the bulk data in an Excel or CSV file, then you can easily insert the data into Oracle tables by using an Oracle utility called SQL*Loader.

SQL*Loader is Oracle's utility program for loading data into an Oracle table.

Most often, SQL*Loader needs two input files to load the data:

1. Control file - contains information about the data: where to load it, what to do if something goes wrong, etc.

2. Data file - contains the data, e.g. in .csv format.

If required, the data in the CSV file can also be used to UPDATE data in Oracle tables.
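A minimal sketch of such a control file, assuming a hypothetical table MY_TABLE with columns COL1 and COL2 and a comma-separated data file (all names are placeholders):

-- my_control.ctl (hypothetical example)
LOAD DATA
INFILE 'my_data.csv'
APPEND INTO TABLE MY_TABLE
FIELDS TERMINATED BY ','
(COL1, COL2)

It would then be run with something like: sqlldr userid=user/password control=my_control.ctl log=my_load.log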

Let me know if you need more inputs about SQL Loader.

Regards,

Anil

Former Member
0 Kudos

Anil,

thanks for your answer.

One question about SQL*Loader, as I have just found another problem. One of my tables uses an ID column which is currently filled by a sequence.

Is it possible with SQL*Loader to insert into the table using the sequence.nextval function? At the moment I am looking for ideas on how to fill this column using the bulk insert.

One workaround was to check the number of rows that have to be inserted, increase the sequence's last_number accordingly to "reserve" an interval, and then set the ID per row from the interval numbers. I have found no way to use the nextval function in the bulk insert.
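For illustration, the "reserve an interval" workaround could look like this in plain SQL, assuming a sequence SEQ1 with increment 1 and 100 rows to insert (the numbers are placeholders):

ALTER SEQUENCE SEQ1 INCREMENT BY 100;  -- jump ahead by the number of rows to insert
SELECT SEQ1.NEXTVAL FROM dual;         -- returns the upper bound of the reserved interval
ALTER SEQUENCE SEQ1 INCREMENT BY 1;    -- restore the normal increment

The IDs from (upper bound - 99) up to the upper bound can then be hardcoded into the rows.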

Michael

Former Member
0 Kudos

Michael,

A trigger can solve your problem.

Suppose you have a table "TABLE1" with three columns: ID, COL1, COL2.

I assume you have also created a sequence, say "SEQ1".

Load the values of COL1 and COL2 through SQL*Loader.

ID will be automatically populated through the trigger.

here is the trigger code:

CREATE OR REPLACE TRIGGER "Table1_INCR_ID_RBI"
BEFORE INSERT ON table1
FOR EACH ROW
BEGIN
  SELECT SEQ1.NEXTVAL INTO :NEW.ID FROM dual;
END;

Regards,

Anil

Former Member
0 Kudos

Michael,

There are two easy ways to do it through the control file in SQL*Loader.

1. In your control file:

load data
into table TABLE1
(
  ID "SEQ1.nextval",
  COL1 ....,
  COL2 ....
)

2. Or use SQL*Loader's own SEQUENCE generator (it might be faster than calling the database's sequence NEXTVAL).

load data
into table TABLE1
(
  ID SEQUENCE(MAX,1),
  COL1 ....,
  COL2 ....
)

The SEQUENCE function is used to generate a unique value in the column ID. It finds the current maximum value in column ID and adds the increment (1) to it to obtain the value for each inserted row.

Regards,

Anil

Former Member
0 Kudos

Anil,

thanks for the description of the trigger; this should solve my problem with the sequence.

After changing the SGA size on the database, I hope to also get rid of the error saying "cannot allocate x bytes of shared memory".

I consider SQL*Loader a last resort because I would have to use FTP to get the file from MII to the DB server, which may cause new problems. I prefer to use an MII query to execute the SQL (or PL/SQL) statement.

Anyway, it's good to have alternatives when you're stuck.

Michael

Former Member
0 Kudos

Hi Michael,

I tested the stored procedure with the following scenario/results:

+ Windows 2003 Server Std 32Bit (4GB RAM)

+ MS SQL 2000 SP4

+ MII 12.0.6

+ 100 rows (XML file = 49KB) took 185ms - 197ms.

+ 1000 rows (XML file = 479KB) took 4734ms - 5354ms.

+ 10000 rows (XML file = 4786KB) took 420123ms - 430221ms.

Regards

Pedro

Former Member
0 Kudos

Hi Pedro,

thank you for testing! Sounds good!

I am checking right now how to use SQL*Loader for our purposes. At the same time I am working on the alternative I described above, using the XSU. The problem with the XSU call is that the XML document created in MII has to be passed in the query as a string to the PL/SQL variable. Strings in PL/SQL, however, may only be 32KB in size, so I have to break up the string into 32KB chunks.
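For illustration, the chunks could be reassembled into a CLOB on the PL/SQL side, assuming the MII query passes each 32KB piece as its own parameter ([Param.1], [Param.2], ... are placeholders here):

DECLARE
  xmlDoc CLOB;
  chunk  VARCHAR2(32767);
BEGIN
  DBMS_LOB.CREATETEMPORARY(xmlDoc, TRUE);
  chunk := '[Param.1]';  -- first 32KB chunk
  DBMS_LOB.WRITEAPPEND(xmlDoc, LENGTH(chunk), chunk);
  chunk := '[Param.2]';  -- second chunk, and so on
  DBMS_LOB.WRITEAPPEND(xmlDoc, LENGTH(chunk), chunk);
  -- xmlDoc could then be passed to a procedure like PROC_BULKTEST below
END;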

Perhaps I can also benchmark the difference between the "chunk" and the Loader alternatives.

Finally, we are also checking where the time is spent when an MII transaction calls a query. Perhaps we can speed this up as well.

Michael

Former Member
0 Kudos

Hi Michael,

if I remember right, you can use the XMLLoader in Oracle. If not, then the best option would be to write your own stored procedure. Let me know the result.

Regards

Pedro

Former Member
0 Kudos

Hi Pedro,

I will have a look at the XMLLoader and tell you the results.

Concerning stored procedures (SPs)... we are using them, but my problem is how to get the data to the SP. As an example, I have to store 150 rows with 34 fields. I would have to build a two-dimensional array of the values in MII, then call a query which calls the SP. Maybe it is possible to pass an XML document from the transaction to the query to the SP, and have the SP interpret the XML (XMLLoader?).

Michael

Former Member
0 Kudos

Hi Michael,

it is a long time ago, but I think it is possible. I will check and come back to you.

Regards

Pedro

Former Member
0 Kudos

Pedro,

today I did my first successful test using the XSU (XML SQL Utility) in Oracle.

I have created a stored procedure that takes an XML document with the rows, columns, and values to be inserted, plus the table name.


create or replace PROCEDURE PROC_BULKTEST 
  (xmlDoc in CLOB, tableName in VARCHAR2, refcur in out <somedefinition>.return_cur)
AS
  insCtx dbms_xmlsave.ctxType;
  rowcnt number;
BEGIN
  insCtx := dbms_xmlsave.newContext(tableName);
  dbms_xmlsave.setignorecase(insCtx,1);
  rowcnt := dbms_xmlsave.insertxml(insCtx, xmlDoc);
  dbms_xmlsave.closecontext(insCtx);
  
  OPEN refcur FOR SELECT rowcnt AS cnt FROM dual;
END PROC_BULKTEST;

The setignorecase call is important, because XSU is case-sensitive by default and expects "ROW", but MII sends "Row".

The other problem is that the default MII XML format (Rowsets - Rowset - Row) is not valid for XSU. You have to get rid of the "Rowsets" node. If the XML only contains the "Rowset - Row" nodes, it is OK.
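For illustration, the document passed to XSU would then look something like this (the column names are made up; with setignorecase enabled, the element case does not matter):

<Rowset>
  <Row>
    <ID>1</ID>
    <COL1>value1</COL1>
  </Row>
  <Row>
    <ID>2</ID>
    <COL1>value2</COL1>
  </Row>
</Rowset>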

You can call the SP using an SQLQuery in FixedQueryWithOutput mode. The query definition I have used:


DECLARE
  xmlDoc CLOB;
  tableName VARCHAR2(100);
BEGIN
  xmlDoc := TO_CLOB( '[Param.1]' );
  tableName := '[Param.2]';
  
  PROC_BULKTEST(
    xmlDoc,
    tableName,
    refcur => ?
  );

  COMMIT;
END;

I do not know if this is the solution you had in mind? However, it seems to speed up the insert. I fed the SP an XML document containing 50 rows, and it took around 500ms. Using a repeater with a command query that inserts every row one by one, it took around 10000ms.

Michael
