on 09-27-2007 9:36 AM
Hi
In BLT I am FTPing a CSV file with approx. 7000 lines, loading it with "TextLoader", parsing it with "FlatFileParser", and then stepping through it with a Repeater that runs an SQL insert for each row.
It all works OK, but it takes about half an hour. I was wondering if anyone had any better suggestions?
Thanks
Nick
PS Here is a sample of the CSV file
"Ident","Make","Model","Description","Serial","LastCal","NextCal","DueDate","StatusChanged","CalStatus","Live","Job"
"xxx1706","AGILENT","34401A","DIGITAL MULTIMETER","US3612","2003-09-12","2003-09-12","2003-09-14","N","ok","N","10410"
"yyy1724","AGILENT","8482H","POWER SENSOR","MY4109","2001-04-18","2002-03-18","2009-03-21","N","so-so","N","1048"
My repeater strips out the header row ([LineNumber > 1]), and my insert is just
INSERT into mytable VALUES ([Param.1])
The only other thing I have to do is a string replace on each line as follows
stringreplace( stringreplace( stringreplace( Repeater_0.Output{/Row/Line} , "'" , "" ) , "\"" , "'" ) , "'None'" , "''" )
to first strip out single quotes (found in the Description field), then replace double quotes with single quotes, and finally replace empty date fields (which appear in the CSV file as 'None') with an empty string.
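For readers who want to check the order of operations, here is a rough Python equivalent of that nested stringreplace() chain; the function name and the sample line are illustrative only, not xMII syntax.

```python
# Rough Python equivalent of the nested stringreplace() chain above.
# clean_line and the sample line are illustrative, not xMII syntax.

def clean_line(line: str) -> str:
    line = line.replace("'", "")         # 1. strip stray single quotes (Description field)
    line = line.replace('"', "'")        # 2. turn the CSV double quotes into SQL single quotes
    line = line.replace("'None'", "''")  # 3. blank out 'None' date fields
    return line

sample = '"xxx1706","AGILENT","34401A","O\'BRIEN METER","US3612","None","2003-09-12","2003-09-14","N","ok","N","10410"'
print(clean_line(sample))
```

Note that the order matters: step 1 must run before step 2, otherwise the stray apostrophes would survive as field delimiters in the generated VALUES list.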
Nick
Generally in BLS, when a Repeater iterates over a large number of rows from a single file, the transaction takes a long time to execute (when run in the editor with F5 / F6).
But compare the execution time of the BLS with an XacuteQuery that calls the same BLS; hopefully the XacuteQuery (run by hitting Test) will take less time.
Though I don't know whether the XacuteQuery's performance will satisfy you any more than the performance of the BLS itself.
Regards
Som
Nick
Well, there is no built-in BLS action called <b>SQL Loader/SQL Saver</b>. SQL Query is the only action available for retrieving data from, and inserting or updating data in, SQL Server.
What you did in your current BLS (an Insert command in a SQL Query, driven by the Repeater) is the correct way to insert the data into the SQL DB.
Regards
Som
Nick:
I would use SQL*Loader also. Info at:
http://www.orafaq.com/faqloadr.htm
Any time you are doing "bulk inserts", there are special DB techniques that can avoid substantial overhead (e.g. don't commit each row, grab a connection once, etc.). These happen automatically with SQL*Loader.
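The "one connection, one commit" idea can be sketched like this; sqlite3 stands in for the real database here, and the table and column names are made up:

```python
# Sketch of the bulk-insert idea: one connection, one executemany(),
# one commit, instead of a round trip and commit per row.
# sqlite3 stands in for the real DB; table/column names are made up.
import csv, io, sqlite3

csv_text = '''"Ident","Make","Model"
"xxx1706","AGILENT","34401A"
"yyy1724","AGILENT","8482H"
'''

rows = list(csv.reader(io.StringIO(csv_text)))[1:]  # skip the header row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (ident TEXT, make TEXT, model TEXT)")
conn.executemany("INSERT INTO mytable VALUES (?, ?, ?)", rows)  # parameterised, batched
conn.commit()  # single commit for the whole batch
print(conn.execute("SELECT COUNT(*) FROM mytable").fetchone()[0])  # → 2
```

Using parameterised placeholders instead of building the VALUES string by hand would also remove the need for most of the quote-stripping in the original transaction.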
Alternatively, you might be able to run some PL/SQL with the "LOAD DATA" syntax (and specifying your CSV file with INFILE). Google for examples. Also, there are many performance options that can make a HUGE difference in load speed.
You could theoretically write a custom BLS action to do this also.
Rick
Nick,
If you have a standard .csv file sitting on the OS, then Oracle has a feature called external tables: you can create a read-only database table that refers to this .csv file.
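A sketch of what that DDL might look like for the sample CSV; the directory path, table names, and column types are assumptions, and the column list is shortened here for space:

```sql
-- Hypothetical external-table DDL; directory path, names, and
-- column types are assumptions, column list shortened for space.
CREATE DIRECTORY csv_dir AS '/data/incoming';

CREATE TABLE cal_ext (
  ident       VARCHAR2(20),
  make        VARCHAR2(30),
  model       VARCHAR2(30),
  description VARCHAR2(80)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('calibration.csv')
);

-- A single set-based insert then replaces the 7000 row-by-row inserts:
INSERT INTO mytable SELECT * FROM cal_ext;
```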
If you want to use SQL*Loader, the DIRECT path option will speed up the loading significantly; in its control file you specify the delimiter, the fields, and the target table.
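A minimal control file along those lines, matched to the sample CSV, might look like the sketch below; the file names, APPEND mode, and the NULLIF handling for the 'None' dates are my assumptions, not something stated in the thread. It would be run with something like sqlldr user/pass control=load.ctl.

```
-- hypothetical load.ctl for the sample CSV above
-- SKIP=1 drops the header row; DIRECT=TRUE uses the direct path
OPTIONS (SKIP=1, DIRECT=TRUE)
LOAD DATA
INFILE 'calibration.csv'
APPEND INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(ident, make, model, description, serial,
 lastcal  DATE "YYYY-MM-DD" NULLIF lastcal='None',
 nextcal  DATE "YYYY-MM-DD" NULLIF nextcal='None',
 duedate  DATE "YYYY-MM-DD" NULLIF duedate='None',
 statuschanged, calstatus, live, job)
```

The NULLIF clauses would also replace the 'None'-to-empty-string stringreplace from the original transaction.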
Hope this helps.
Cheers,
Nisch