
Stored procedure runs extremely slow in certain scenarios

Former Member

Greetings!

In our batch scheduler, we call the same stored procedure (let's call it "my_proc") twice.

batch job 1a: execute my_proc("dinosaurs")

batch job 1b: execute my_proc("humans")

I list them as 1a and 1b because they can run in either order (in practice, 1a normally runs first, followed by 1b about 2 minutes later).

My problem:

When 1a runs first, followed by 1b, the result is:

1) "dinosaurs" ends up dealing with about 30k records and runs in about 6 seconds

2) "humans" ends up dealing with about 180k records and runs in about 20 minutes

It's been this way for a long time, so we tried a couple of things to improve the situation, with confusing results:

Experiment 1) We recompiled the stored procedure and then executed batch job 1b, so that its query plan would be cached from that first run. It finished in about 35 seconds, a vast improvement over 20 minutes. Batch job 1a then ran that night in its normal 6 seconds. About 24 hours later, we let the batch run on its own, so 1a ran first in its 6 seconds, but 1b was back to its usual 20 minutes!

Experiment 2) We forced job 1b to always run first, followed by 1a. This resulted in 1b always running in 35 seconds, followed by 1a running in 6 seconds!


Question

One solution we have is to force a dependency so that 1a waits for 1b before executing; however, we'd like to keep these jobs decoupled, as they really shouldn't depend on each other from a data perspective. Does anyone know what the problem is and what the possible solutions may be?

One theory I have is that the procedure cache is coming into play here, but if that's the case, I don't know how to fix it short of maybe creating a new stored procedure with the same name. (Before you ask me to try this: we can't replicate the issue in a test environment; with the same data set it always runs fast there, no matter what combination of execution sequences we try. Production is where we are experiencing this performance issue.)
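For illustration, here's a minimal sketch of the sequence I suspect (using the example proc name from above; the "with recompile" line is just one possible workaround I've read about, not something we've deployed):

```sql
-- First execution compiles and caches a plan optimized for the
-- small parameter (~30k rows touched):
exec my_proc "dinosaurs"

-- Second execution reuses the cached plan, which is a poor fit
-- for the large parameter (~180k rows touched):
exec my_proc "humans"

-- Possible one-off workaround: force a fresh plan for this call only
exec my_proc "humans" with recompile
```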


Thank you in advance!

Former Member

Thank you for the extremely helpful reply, Mark!

It would probably be easier for us to implement option 2 (create proc ... with recompile). I'll look into this more and hope to test this in the production scenario.

So if we use "with recompile", Sybase will explicitly not use whatever's in the cache, correct? And does Sybase purge the existing query plan from the cache (if one exists)?

Former Member

Run optdiag and check whether there is any skew in the statistics. If so, run sp_modify_stats to remove the skew. Also check whether the tables are fragmented.

If the query contains a UNION, try dismantling it into separate queries.

Check whether the composite indexes have the same columns, in the same column order, between servers.

Check if there is any data type mismatch in predicate columns.
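A rough sketch of the first two checks (server, database, table, and column names are placeholders; verify the exact optdiag flags and sp_modify_stats options against your ASE version's documentation):

```sql
-- From the OS shell, dump statistics to a file and inspect the
-- histogram for skewed cells:
--   optdiag statistics mydb..my_table -Usa -SMYSERVER -o my_table.opt

-- From isql, smooth out skew in the density values for a column:
exec sp_modify_stats "mydb..my_table", "my_column", "REMOVE_SKEW_FROM_DENSITY"
```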

Mark_A_Parsons
Contributor

For option 2, just keep in mind (and this is more for anyone coming across this thread in the future, e.g. via a Google search) ...

Creating a proc 'with recompile' will require the proc to be compiled *every* time it's executed.  This is likely ok if the proc is not executed very often and/or the compilation overhead is minimal.  On the other hand, using this method for a proc that is executed a lot of times in a short period of time, or which requires a high overhead for compilation, could degrade performance not only for most of the folks running the proc but also eat up precious dataserver cpu cycles that could be better used elsewhere.

At one client, a developer saw how well 'create proc ... w/recompile' worked for a particular batch process (i.e., the proc was not called very often, and the forced recompilation meant the difference between the proc running in 2 minutes vs 5+ hrs), so he decided to use 'create proc ... w/recompile' for another proc that would occasionally go walkabout. The problem with his solution: this particular proc was called ... literally ... a million times a day. Said proc normally ran in something like 6ms; the forced recompilation for *every* execution added upwards of 80ms to each execution, bumped up overall dataserver cpu usage by 350%, and caused a lot of contention on the system tables used for compilation. In that particular case we had to spend more time understanding why the proc went walkabout and then come up with a solution that did not require excessive use of 'with recompile'.

--------------------------------

As for your proc cache question ...

I suggest you run some tests and pay attention to what happens to the contents of master..monCachedProcedures (mCP).

In particular, pay attention to a) the number of entries in mCP for your proc and b) how mCP.CompileDate changes when a proc is (re)compiled.
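For example, something along these lines (column names per the monCachedProcedures MDA table; querying it requires mon_role and the relevant monitoring options enabled on the server):

```sql
-- One row per cached plan; multiple rows for the same proc mean
-- multiple plans are in cache. Watch how CompileDate changes
-- across executions.
select DBName, ObjectName, PlanID, CompileDate
from master..monCachedProcedures
where ObjectName = "my_proc"
```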

kevin_sherlock
Contributor

When you "create proc xxx with recompile" - When you execute that proc, a plan is cached and executed, and then immediately removed after execution.  There is no other task that will ever reuse that plan because it is removed from proc cache.

When you "execute proc yyy with recompile" - After execution of the proc, a cached plan is left behind in proc cache, and can be reused by another task as long as it stays in proc cache.   This is why in the documentation it states that it can swamp proc cache with many plans for this procedure if you execute in this manner repeatedly.

Disadvantages to either approach only really concern repeated executions, in tight loops for example.  As Mark states, the disadvantage with "create with recompile" is the compile time incurred if the proc is executed many times.  Disadvantage with "exec with recompile" is that many plans will be cached (one for each such execution), and force other plans out of cache.  Not only that, you might not _want_ another task to get your cached plan because it was compiled with boundary parameters etc.
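To make the contrast concrete, a sketch of the two forms (the procedure body, table, and column names here are made up for illustration):

```sql
-- Variant 1: compiled on every execution; the plan is discarded
-- after the run and is never reused by another task
create procedure my_proc @era varchar(30) with recompile
as
    select count(*) from fossil_records where era = @era

-- Variant 2: this call gets a fresh plan, but that plan stays in
-- proc cache afterward and other tasks may pick it up
exec my_proc "humans" with recompile
```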

Former Member

Thanks! This is really helpful. It looks like "create... with recompile" is what I need. This procedure only runs nightly, so the overhead is not much of a concern.
