
CSV to Oracle Table

Former Member

Hi

In BLT I am FTPing a CSV file with approx 7000 lines, then loading it with "TextLoader", parsing it with "FlatFileParser", and then stepping through it with a repeater running an SQL insert.

It all works OK, but it takes about half an hour. I was wondering if anyone had any better suggestions?

Thanks

Nick

PS Here is a sample of the CSV file

"Ident","Make","Model","Description","Serial","LastCal","NextCal","DueDate","StatusChanged","CalStatus","Live","Job"

"xxx1706","AGILENT","34401A","DIGITAL MULTIMETER","US3612","2003-09-12","2003-09-12","2003-09-14","N","ok","N","10410"

"yyy1724","AGILENT","8482H","POWER SENSOR","MY4109","2001-04-18","2002-03-18","2009-03-21","N","so-so","N","1048"

My repeater strips out the header row ([LineNumber > 1]), and my insert is just

INSERT into mytable VALUES ([Param.1])

The only other thing I have to do is a string replace on each line as follows

stringreplace( stringreplace( stringreplace( Repeater_0.Output{/Row/Line} , "'" , "" ) , "\"" , "'" ) , "'None'" , "''" )

to first strip out single quotes (found in the description field), then replace double quotes with single quotes, and finally replace empty date fields (which appear in the CSV file as 'None') with an empty string.
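
For clarity, after the string replaces and the [Param.1] substitution, the statement that actually gets executed for the first data row looks roughly like this (assuming mytable's columns simply follow the CSV header order):

INSERT into mytable VALUES ('xxx1706','AGILENT','34401A','DIGITAL MULTIMETER','US3612','2003-09-12','2003-09-12','2003-09-14','N','ok','N','10410')

So there is one full INSERT statement, and one round trip to the database, per repeater iteration.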

Accepted Solutions (1)


Former Member

Nick

Generally in BLS, when the repeater has to step through a very large number of rows from one file, execution takes a long time (when you run it in the editor with F5 / F6).

But compare the execution time of the BLS with an XacuteQuery built on the same BLS; hopefully the execution time of the XacuteQuery (hitting Test) will be shorter.

Though I don't know whether the time performance of the XacuteQuery, rather than the time performance of the BLS itself, will satisfy you.

Regards

Som

Former Member

Thanks, I think this was a little quicker.

I also moved my replace further up the procedure, to the Flat File Parser input, and I think this helped too.

One of my colleagues has suggested using SQL*Loader; does anyone know about calling this command-line utility from within BLS?

Former Member

Nick

Well, there is no built-in BLS action called SQL Loader / SQL Saver. The SQL Query action is the only option for retrieving/loading and inserting/updating data from (into) the SQL database, if you want to stay within BLS.

What you are doing in your current BLS, running an Insert command in an SQL Query through the Repeater, is a perfectly good way to insert the data into the database.

Regards

Som

Answers (2)


Former Member

Nick:

I would use SQL*Loader also. Info at:

http://www.orafaq.com/faqloadr.htm

Any time you are doing "bulk inserts", there are special DB techniques that can avoid substantial overhead (e.g. don't commit after each row, grab a connection only once, etc.). This happens automatically with SQL*Loader.

Alternatively, you might be able to run some PL/SQL with the "LOAD DATA" syntax (specifying your CSV file with INFILE). Google for examples. Also, there are many performance options that can make a HUGE difference in load speed.
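
As a very rough sketch (the file name here is made up, and I'm assuming the target table's columns simply follow the CSV header order), a SQL*Loader control file for this load might look something like:

OPTIONS (SKIP=1)  -- skip the header row
LOAD DATA
INFILE 'equipment.csv'
APPEND
INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( Ident, Make, Model, Description, Serial,
  LastCal  DATE "YYYY-MM-DD" NULLIF LastCal='None',
  NextCal  DATE "YYYY-MM-DD" NULLIF NextCal='None',
  DueDate  DATE "YYYY-MM-DD" NULLIF DueDate='None',
  StatusChanged, CalStatus, Live, Job )

The NULLIF clauses would also take care of the 'None' date values, so the string-replace step wouldn't be needed for those.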

You could theoretically write a custom BLS action to do this also.

Rick

Former Member

Nick,

If you have a standard .csv file sitting on the OS, then Oracle has an external table feature: you could create a read-only database table referring to this .csv file.
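
As a rough sketch (the directory path, file name and column sizes are just guesses based on the sample CSV above), that could look something like:

-- hypothetical directory object pointing at the folder holding the CSV
CREATE DIRECTORY csv_dir AS '/data/csv';

CREATE TABLE equipment_ext (
  Ident VARCHAR2(20), Make VARCHAR2(40), Model VARCHAR2(40),
  Description VARCHAR2(100), Serial VARCHAR2(20),
  LastCal VARCHAR2(10), NextCal VARCHAR2(10), DueDate VARCHAR2(10),
  StatusChanged VARCHAR2(1), CalStatus VARCHAR2(10),
  Live VARCHAR2(1), Job VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('equipment.csv')
);

A single INSERT INTO mytable SELECT ... FROM equipment_ext would then load everything in one statement instead of thousands of individual inserts.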

If you want to use SQL*Loader, the DIRECT path option will speed up the loading significantly; in its control file you specify the delimiter, the fields and the target table.
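
For example (the user, password, connect string and file names are just placeholders here), the command-line call could look roughly like:

sqlldr userid=loaduser/password@mydb control=equipment.ctl log=equipment.log direct=true

where equipment.ctl is a control file along the lines of the one sketched earlier in this thread.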

Hope this helps.

Cheers,

Nisch