on 08-01-2016 1:59 AM
Dear All,
I'm facing an extraction issue. When I use in-doc SQL (a shell here-document) to extract a table, no extract file is created, but if I put the same SQL into a separate file, it succeeds. Here's my sample script --
asiq160@bigDataIQ:/iq_data/bcp> cat ex.sh
#!/bin/bash
echo `date +%F' '%T`:started
dbisql -c "uid=dba;pwd=sql" -host 10.128.244.161 -port 2641 -nogui << EOF
set temporary option temp_extract_name1='export.dat';
set temporary option temp_extract_directory='/iq_data/bcp';
set temporary option isql_show_multiple_result_sets='on' ;
SET temporary OPTION ISQL_PRINT_RESULT_SET = 'ALL';
select * from STOCK;
set temporary option temp_extract_name1='';
EOF
echo `date +%F' '%T`:ended
asiq160@bigDataIQ:/iq_data/bcp> ./ex.sh
2016-07-26 03:00:12:started
(DBA)> Execution time: 0.003 seconds
(DBA)> 2016-07-26 03:00:13:ended
asiq160@bigDataIQ:/iq_data/bcp> ls
ex.sh
=======================================================================================
Then, if I write the same SQL to a separate file --
asiq160@bigDataIQ:/iq_data/bcp> cat ex.sql
set temporary option temp_extract_name1='export.dat';
set temporary option temp_extract_directory='/iq_data/bcp';
set temporary option isql_show_multiple_result_sets='on' ;
SET temporary OPTION ISQL_PRINT_RESULT_SET = 'ALL';
select * from STOCK;
set temporary option temp_extract_name1='';
and call it with dbisql --
asiq160@bigDataIQ:/iq_data/bcp> dbisql -c "uid=dba;pwd=sql" -host 10.128.244.161 -port 2641 -nogui ex.sql
then '/iq_data/bcp/export.dat' is generated.
=======================================================================================
Could anyone tell me what the difference between the two is?
Thanks in advance for any ideas.
Regards
Eisen
Hi Eisen,
I've seen an approach like your in-doc SQL used with the isql utility. It originally comes from the ASE ecosystem, supports OpenServer back ends (like IQ), and has batch-processing capabilities that more or less rely on input redirection. I have to admit that I never used this approach with dbisql, for a couple of reasons: dbisql has more powerful batch-processing capabilities, like nested scripts via the READ statement, parameter passing, and selective redirection of query result output. In no-GUI interactive mode (which you're trying to simulate with input redirection), dbisql is pretty limited and even refuses to accept multi-line statements.
So, in brief, I'd say that your in-doc approach is isql'ish whereas the sql file approach is dbisql'ish.
Is there a specific reason why you prefer the in-doc approach?
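The nested-script capability mentioned above can be sketched roughly like this (file names are invented for illustration, and I haven't run this against a live server):

```sql
-- outer.sql (hypothetical): set the extract target once, then pull in a
-- per-table script with dbisql's READ statement.
set temporary option temp_extract_directory='/iq_data/bcp';
read extract_stock.sql;

-- extract_stock.sql (hypothetical) would contain the per-table part:
--   set temporary option temp_extract_name1='export.dat';
--   select * from STOCK;
--   set temporary option temp_extract_name1='';
```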
HTH
Volker
Dear Volker,
Thank you very much for your answer...
Yeah... you have a sharp eye -- having been an ASE DBA for 16 years, in-doc SQL with isql has become my whole shell-script style... And frankly, I think in-doc SQL is more flexible than a pure SQL file, since I can use shell variables in the SQL body, like -- select * from tbname where id>${1}
Anyway... however much I'll miss the in-doc SQL, dbisql is not a good choice for it. I'll try to change my scripts first. Thanks again, Volker.
Regards
Eisen
Dear Eisen,
the dbisql way to use shell variables is the READ statement with parameters. These are not limited to places where SQL parameters are allowed; they work more like text replacement.
So the equivalent to your example would be:
>>
-- Qry0.sql
parameters pid;
select * from tbname where id>{pid}; -- name from the parameters list in curly brackets
<<
call it using
dbisql -c ... read Qry0.sql [${1}]
or use the READ statement inside your interactive session (you may need to specify the directory path, or at least be aware of your session's current-directory setting). The [] brackets mark the parameter(s) passed to the SQL script on the command line; in some cases they're optional.
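To keep a one-argument call style at the shell level, the READ invocation can be wrapped in a small script. This is only a sketch: the wrapper name, the default value, and the assumption that Qry0.sql sits in the current directory are all mine, and I haven't run it against a live IQ server.

```shell
#!/bin/bash
# Hypothetical wrapper keeping the one-argument call style: ./qry0.sh 42
set -eu

pid=${1:-42}                  # default only so the sketch runs standalone
cmd="read Qry0.sql [$pid]"    # READ statement with the positional parameter

# Guarded so the sketch can run even without a live IQ server.
if command -v dbisql >/dev/null 2>&1; then
    dbisql -c "uid=dba;pwd=sql" -host 10.128.244.161 -port 2641 -nogui "$cmd"
else
    echo "would run: $cmd"
fi
```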
I haven't actually tested this, so I hope I didn't put in any typos.
HTH
Volker
Dear Volker,
Thanks for your suggestion...
Yes, I know the READ statement's parameters can partially replace the in-doc shell variables. But to me --
1. A script used to be one file for one purpose -- now it must be one shell file plus several SQL files.
2. Before, I could pass shell parameters into the SQL body directly -- e.g.
./listcustomerinfo.sh Customer1 and ./listcustomerinfo.sh Customer2
Now I have to edit that SQL file each time, and I find it not as intuitive as the previous in-doc shell... And sometimes that shell variable is generated by the shell itself, with awk or other third-party tools... like --
...
filesize=`ls -l myfile|awk '{print $5}'`
...
update tablename set col1=${filesize} where col2='myfile'...
It feels like this turns the SQL body from a dynamic one into a static one...
3. And the SQL file is not compatible with Sybase ASE or other DBs. That's the hardest one.
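One way to keep the in-doc style, sketched under my own assumptions and untested against a live IQ server: expand the heredoc into a temporary file first, so shell variables like $filesize still expand exactly as before, then feed that file to dbisql -- the variant that worked in the original post. Table and column names are taken from the example above; the stand-in data file exists only so the sketch runs.

```shell
#!/bin/bash
set -eu

printf 'demo' > myfile                        # stand-in data file, so the sketch runs
filesize=$(ls -l myfile | awk '{print $5}')   # 4 bytes for the stand-in file

# Write the shell-expanded SQL to a temp file; ${filesize} expands here,
# just as it would in an in-doc heredoc.
tmpsql=$(mktemp)
cat > "$tmpsql" <<EOF
update tablename set col1=${filesize} where col2='myfile';
EOF

# Guarded so the sketch can run even without a live IQ server.
if command -v dbisql >/dev/null 2>&1; then
    dbisql -c "uid=dba;pwd=sql" -host 10.128.244.161 -port 2641 -nogui "$tmpsql"
fi
```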
Anyway, it's the workaround for this in-doc sql. Thanks again.
Regards
Eisen