
How to kill the second instance of a Data Services job

Former Member

Hi All,

Is there a way to automatically kill the second instance of a job that has been kicked off accidentally or started by a schedule?

We have a situation where the job needs to be killed if it is started while the first instance is still running.

Thanks 

Accepted Solutions (1)


Former Member

Hi Surya,

The best way is to make a control table with the job name and its run status. Add a script to the job so that when the job starts execution it sets the status to 1 in the control table, and after successful execution it sets the status back to 0. Before the job starts executing, it checks the run status in the control table; if the status is 1, an instance of the job is already running, so the job should not execute.
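As a rough sketch of that start-of-job check in Data Services script language (all names are illustrative assumptions: a datastore DS_CTRL, a table CTRL_JOB_STATUS(JOB_NAME, RUN_STATUS), and job-level variables $JobName (varchar) and $RunStatus (int) declared in the job):

# Script placed at the start of the job, before any dataflows.
$JobName = job_name();
$RunStatus = sql('DS_CTRL', 'SELECT RUN_STATUS FROM CTRL_JOB_STATUS WHERE JOB_NAME = {$JobName}');

if ($RunStatus = 1)
begin
   # Another instance is already running, so abort this one immediately.
   raise_exception('Job ' || $JobName || ' is already running - aborting this instance.');
end
else
begin
   # Mark the job as running before the dataflows start.
   sql('DS_CTRL', 'UPDATE CTRL_JOB_STATUS SET RUN_STATUS = 1 WHERE JOB_NAME = {$JobName}');
end

A matching script at the end of the job would set RUN_STATUS back to 0; if the job aborts abnormally, the flag has to be reset manually or by a cleanup step.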

Regards,

Gokul

Former Member

Thanks a lot for your reply; let me try it.

Former Member

I created a job and tested it; it's working fine. Thanks again.

Answers (3)


Former Member

Great! Please close the thread.

Former Member

You could use a conditional to check that there is only one instance of the job running (for example, by checking the repository tables/views); if another instance is found, the job, whether started manually or by a schedule, would quit automatically. We use something similar to schedule a second instance of a particular job in case the first one fails: when the second instance runs, it checks the repository table to see whether the first one finished, failed with errors, or is still running, and then acts accordingly.
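A hedged sketch of that repository check in Data Services script language. The datastore name DS_REPO, the repository view ALVW_HISTORY, and the SERVICE/END_TIME columns below are assumptions; verify the actual view and column names in your repository version before using them:

# Assumption: DS_REPO points at the Data Services repository database and
# ALVW_HISTORY holds one row per job run (SERVICE = job name, END_TIME null while running).
$JobName = job_name();
$RunningCount = sql('DS_REPO', 'SELECT COUNT(*) FROM ALVW_HISTORY WHERE SERVICE = {$JobName} AND END_TIME IS NULL');

# The current run also appears in the history, so more than 1 means another live instance.
if ($RunningCount > 1)
begin
   raise_exception('Another instance of ' || $JobName || ' is still running - quitting this one.');
end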

Former Member

Why do you need to kill the job? It is better to stop the second job from executing until the first one has completed successfully. Add a script after the dataflows of the first job to capture its execution status. Check this
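For illustration, a script placed after the last dataflow could write the outcome back to the same control table described in the accepted answer (DS_CTRL and CTRL_JOB_STATUS remain illustrative names), so a second instance knows the first one has completed:

# Script placed after the last dataflow of the first job.
# Resets the run flag so a waiting or later instance is allowed to execute.
$JobName = job_name();
sql('DS_CTRL', 'UPDATE CTRL_JOB_STATUS SET RUN_STATUS = 0 WHERE JOB_NAME = {$JobName}');
print('Run status reset to 0 for job ' || $JobName);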