
Executing a procedure in parallel mode

Former Member
0 Kudos

Hi All

  I am facing performance problems while executing a procedure. I initially tried cursors, but since the cursors were hurting performance I avoided them and applied WHILE loops; still the performance does not improve.

The scenario is a table containing a list of quotes, where each quote has a different status. I am calculating the time a quote spends in each status. I could figure out the logic, but as I need to repeat the same logic for each distinct quote existing in the table, I need to loop through the table once per quote.

Both FOR and WHILE loops are impacting performance. Request your ideas on improving the performance while looping over table records.

As per my understanding, the logic is being executed sequentially. If I could execute the logic in parallel for different quotes, it would help improve the performance. Request your ideas on whether there is any such possibility, such as using a number of threads, etc.

Thanks

Santosh

Accepted Solutions (0)

Answers (3)


Former Member
0 Kudos

Hi Santosh,

  As per my understanding, a database procedure cannot be parallelized at the SQL statement level; we can only write one thread in one procedure. But the database itself can execute in parallel: for example, when we submit one SQL query statement, the database can parse and execute it based on a certain strategy, such as a parallel strategy for performance improvement.

Anyway, if your procedure's performance is in trouble and there is no better solution so far, you can choose another way to meet your requirement: for example, first create a view for each quote status, then union the created views to fetch and analyze the related data.
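As an illustration of this idea, here is a minimal sketch; the table and column names (QUOTE_STATUS_LOG, QUOTE_ID, STATUS, TS) are hypothetical, not from the thread:

```sql
-- One view per quote status...
CREATE VIEW V_STATUS_10 AS
  SELECT QUOTE_ID, STATUS, TS FROM QUOTE_STATUS_LOG WHERE STATUS = '10';
CREATE VIEW V_STATUS_20 AS
  SELECT QUOTE_ID, STATUS, TS FROM QUOTE_STATUS_LOG WHERE STATUS = '20';

-- ...then union the per-status views for the combined analysis,
-- letting the database process the status sets independently:
SELECT * FROM V_STATUS_10
UNION ALL
SELECT * FROM V_STATUS_20;
```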

This is only my suggestion; I hope it can help you solve your trouble. Thanks.

Regards,

Jerry.

Former Member
0 Kudos

Hi Jerry

  Thanks for contributing. I am following the same approach: calculate the time for the different statuses using different procedures and combine the whole result into one view. All my procedures update a single table, and on top of that table I am creating an attribute view.

But as per the example I gave, I am facing performance issues even while calculating the time spent in a single status, as the number of records is high.

Thanks

Santosh

lbreddemann
Active Contributor
0 Kudos

Hi Santosh,

please do provide the table DDL, some test data, and an example of how your output should look.

On top of that I'd like to know what is the functional requirement you try to solve here?

As nobody wants to simply stare at a table of quotes and the number of days per status transition, I assume that you actually want to do aggregated analysis, right?

Things like "what is the status transition that takes longest?" or "top 10 long running quotes in status xyz".
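For instance, the second of these could look roughly like the following sketch, over a hypothetical result table RESULTS(QUOTE, STATUS, DAYS_IN_STATUS) holding the per-quote, per-status durations:

```sql
-- "Top 10 long-running quotes in status 10":
SELECT QUOTE, DAYS_IN_STATUS
FROM RESULTS
WHERE STATUS = '10'
ORDER BY DAYS_IN_STATUS DESC
LIMIT 10;
```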

Is this correct or what *precisely* is the functional usage of this query?

- Lars

Former Member
0 Kudos

Hi Lars

   I have an attribute view as the source, which contains QuoteID, Time Stamp, old_status, and new_status. For a quote there are different possible statuses such as 10, 20, 30, 40, 50, etc., and a case can move back and forth between all the statuses; for example, a case can move 00-10, 10-20, 20-30, 30-10, 10-20, etc. Finally I need to find the total time spent in status 10, and similarly for statuses 20, 30, 40, etc.

My logic below is working fine; the only problem is the performance. Below is my sample code snippet. Thanks for sharing your ideas.

quote_TAB = SELECT CASE_GUID as quote_ID,
            row_number() over ( order by CASE_GUID ) as rownumber
            from ( SELECT DISTINCT CA_LOG_CASE_GUID as "CASE_GUID"
                   from "_SYS_BIC"."packagename/AT_DETAILS" );

SELECT count(*) INTO V_QUOTE_COUNT from :quote_TAB;

WHILE V_COUNTER <= :V_QUOTE_COUNT DO

SELECT quote_ID INTO V_quote from :quote_TAB WHERE rownumber = :V_COUNTER;

T10T = SELECT Quote_ID, POID_ID, TIME_STAMP, old_status, new_status,
       left(TIME_STAMP, 14) as TSTMP,
       row_number() over ( order by IS_TIMESTAMP ) as rowid10
       from "_SYS_BIC"."packagename/AT_DETAILS"
       WHERE new_status = '10' AND Quote_ID = :V_quote;

SELECT count(*) INTO V_COUNT from :T10T; -- number of entries into status 10

IF (:V_COUNT > 0) THEN

T10S = SELECT Quote_ID, POID_ID, TIME_STAMP, old_status, new_status,
       left(TIME_STAMP, 14) as TSTMP,
       row_number() over ( order by IS_TIMESTAMP ) as rowid20
       from "_SYS_BIC"."packagename/AT_DETAILS"
       WHERE old_status = '10' AND Quote_ID = :V_quote;

JOINTIME = SELECT :T10T.Quote_ID as Quote, :T10T.POID_ID as POID,
           :T10T.TSTMP as TSTMP10, :T10S.TSTMP as TSTMP20,
           COALESCE( days_between(:T10T.TSTMP, :T10S.TSTMP),
                     days_between(:T10T.TSTMP, TO_TIMESTAMP(now())) ) as TIMETEN
           from :T10T
           LEFT OUTER JOIN :T10S ON :T10T.rowid10 = :T10S.rowid20;

INSERT INTO SCHEMA.TABLE
  ( SELECT Quote, sum(TIMETEN), :V_STATUS from :JOINTIME group by Quote );

ELSE

INSERT INTO SCHEMA.TABLE values (:V_quote, :V_COUNT, :V_STATUS);

END IF;

V_COUNTER := :V_COUNTER + 1; -- advance to the next quote

END WHILE;
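For comparison, a set-based version of the same status-10 calculation needs no per-quote loop at all: adding PARTITION BY Quote_ID to the window functions lets the database process all quotes in one pass (and in parallel, at its discretion). This is a sketch only, reusing the names from the snippet above and omitting the extra columns:

```sql
-- All entries into status 10, numbered per quote:
T10T = SELECT Quote_ID, left(TIME_STAMP, 14) as TSTMP,
       row_number() over ( PARTITION BY Quote_ID order by IS_TIMESTAMP ) as rowid10
       from "_SYS_BIC"."packagename/AT_DETAILS"
       WHERE new_status = '10';

-- All exits from status 10, numbered per quote:
T10S = SELECT Quote_ID, left(TIME_STAMP, 14) as TSTMP,
       row_number() over ( PARTITION BY Quote_ID order by IS_TIMESTAMP ) as rowid20
       from "_SYS_BIC"."packagename/AT_DETAILS"
       WHERE old_status = '10';

-- One join and one aggregated INSERT for all quotes at once;
-- the join must now also match on Quote_ID:
INSERT INTO SCHEMA.TABLE
  ( SELECT t.Quote_ID,
           sum( COALESCE( days_between(t.TSTMP, s.TSTMP),
                          days_between(t.TSTMP, TO_TIMESTAMP(now())) ) ),
           '10'
    from :T10T as t
    LEFT OUTER JOIN :T10S as s
      ON t.Quote_ID = s.Quote_ID AND t.rowid10 = s.rowid20
    group by t.Quote_ID );
```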

Thanks

Santosh

Former Member
0 Kudos

Hi All

   I have converted the target table from column store to row store, as INSERTs into a column store take a performance hit. With that I could see a little performance improvement, but the performance still needs to improve further.

Request your ideas .

Thanks

Santosh

lbreddemann
Active Contributor
0 Kudos

Sorry Santosh,

but just out of curiosity: how did you find out how much time was spent on the INSERT statement? How can you be sure that you actually improved the runtime with that change?

Did you measure it? If so, how?

- Lars

Former Member
0 Kudos

Hi Lars

   Initially my target table was of column store type. When I executed the procedure in the dev landscape it took approximately 5 minutes 40 seconds. I thought that an INSERT into a columnar table would be a little costlier compared to a row table, so I changed my target table to row store, and the procedure execution time came down to 3 minutes 15 seconds.

Between these two executions this is the only change I made, and to confirm it I executed the procedure several times; hence I came to this conclusion. But I am yet to check it in the quality system. Please correct me if my understanding is wrong.

Thanks

Santosh

Former Member
0 Kudos

Hi Santosh,

How many records are in your table?

Regards

Jerry

Former Member
0 Kudos

Hi Jerry

    Sorry, I was stuck on another task and could not check the thread. In the development environment we have 5,000 records and it takes 4 minutes. I don't have access to the test environment, but the record count there seems to be very large, because when they execute the procedure in the test landscape it never finishes.

Request your ideas.

Thanks

Santosh

Former Member
0 Kudos

Hi Santosh,

According to your information, it takes 4 minutes for all 5,000 records?

As I understand it, that is a lot of time; the performance is not good for only 5,000 records. Even with a row-store table in other DB products such as DB2, Oracle, or SQL Server, 4 minutes for 5,000 records would be considered poor, and if the record count grows beyond 1,000,000 I think the execution result would be very, very bad.

Besides, as you know, SAP HANA's performance is faster than other DB products'.

So I think you need to check and reconsider your solution.

Regards

Jerry

former_member182114
Active Contributor
0 Kudos

Hi Santosh,

With loops/whiles/cursors you will always have sequential execution of code.

In order to parallelize, you need to rethink how you calculate the time per status: imagine sets of data, join everything, and compute at the end. See a sample of the idea, without focusing on syntax, just to express it quickly; I hope it's understandable.

In this sample the statuses are:

01-creation

02-analyzed

03-closed

and the figures will be the differences between steps:

- time_of_analyzes (difference between 01 and 02)

- time_to_close (difference between 02 and 03)

- quote_life_time (difference between 01 and 03)

In this pseudo-code, the idea of the sequential version is to collect the figures row by row and compute the results at the end, OK?

select quote, date from quotes.
  select status, timestamp from quotes_status where quote = quotes-quote.
    if quotes_status-status = '01'.
      lv_creation = quotes_status-timestamp.
    elseif quotes_status-status = '02'.
      lv_analyzed = quotes_status-timestamp.
    elseif quotes_status-status = '03'.
      lv_closed = quotes_status-timestamp.
    endif.
  endselect.
  var_out = select quotes-quote, quotes-date,
            (:lv_analyzed - :lv_creation) as time_of_analyzes,
            (:lv_closed - :lv_analyzed) as time_to_close,
            (:lv_closed - :lv_creation) as quote_life_time from dummy
            union select * from :var_out;
endselect.

To transpose this to "set" thinking, you need at least:

- 4 projections (quotes, quotes_status_01, quotes_status_02, quotes_status_03)

- a left outer join of quotes with the other 3 views

At the end you will have in hand:

- one view with all the fields you need for the calculation

- a projection on top of it where you make the computation to collect the figures

** Probably your IFs aren't as simple as this, but it's the basic idea; you can handle the IFs by adding more filters on each view, or by adding more views with more information.

The important thing to understand is that you need to think in sets of data instead of looping over each row.
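A rough SQL sketch of this set-based shape, using hypothetical tables quotes(quote, quote_date) and quotes_status(quote, status, status_ts) modeled on the pseudo-code, and assuming for simplicity one row per quote and status:

```sql
-- Sketch only: one projection per status, left outer joined back to
-- quotes, with the figures computed at the end in a single projection.
SELECT q.quote, q.quote_date,
       DAYS_BETWEEN(s01.status_ts, s02.status_ts) AS time_of_analyzes,
       DAYS_BETWEEN(s02.status_ts, s03.status_ts) AS time_to_close,
       DAYS_BETWEEN(s01.status_ts, s03.status_ts) AS quote_life_time
FROM quotes AS q
LEFT OUTER JOIN (SELECT quote, status_ts FROM quotes_status WHERE status = '01') AS s01
  ON s01.quote = q.quote
LEFT OUTER JOIN (SELECT quote, status_ts FROM quotes_status WHERE status = '02') AS s02
  ON s02.quote = q.quote
LEFT OUTER JOIN (SELECT quote, status_ts FROM quotes_status WHERE status = '03') AS s03
  ON s03.quote = q.quote;
```

Because this is one declarative statement over sets rather than a loop, the database is free to evaluate the projections and joins in parallel.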

Regards, Fernando Da Rós