Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Parallel Processing

Former Member
0 Kudos

Hi All - We have a program with a SELECT statement

that joins 3 tables and fetches a large amount of data. This program is run quite frequently and takes about a couple of minutes each time. I just wanted to know if we can use parallel processing for this.

1. Is it possible to use the parallel processing technique for a Select statement?

2. Will fetching the data using Packet size option improve performance?

3. Can Parallel processing be used to programs run in foreground?

Regards.

Sam.

9 REPLIES

mvoros
Active Contributor
0 Kudos

Hi,

What do you mean by parallel processing? It is obvious that your bottleneck is the database, and I don't see any way to parallelize this process. You can try to improve your query performance by optimizing it (maybe create a new index, and so on), or you can try to tune the query at the DB level using DB hints.

So my quick answers:

1) I do not think so.

2) I do not know what you mean by that.

3) You can create sub-tasks, run them in the background, and wait for results from all of them. There is an example in the ABAP documentation for this scenario; I do not have access to a system right now.

Cheers

Former Member
0 Kudos

Hi Sam,

Answer to question 1:

As long as you can select your data in many sub-packages (selecting smaller amounts of data each time) and then reassemble them to get the final result you need, you can use the parallel processing technique to improve the performance of the program.
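As a minimal sketch of the sub-package idea, ABAP's SELECT ... PACKAGE SIZE fetches the result in chunks (the table VBAK here is only a placeholder for your own join/result):

```abap
* Sketch only: VBAK stands in for your actual join and result structure.
DATA: lt_chunk  TYPE STANDARD TABLE OF vbak,
      lt_result TYPE STANDARD TABLE OF vbak.

SELECT * FROM vbak
         INTO TABLE lt_chunk
         PACKAGE SIZE 10000.
  " Each pass of this loop delivers up to 10,000 rows in lt_chunk.
  " Process the chunk here (or hand it to a parallel task), then
  " collect it into the final result.
  APPEND LINES OF lt_chunk TO lt_result.
ENDSELECT.
```

Each chunk could then be handed to a separate task for processing, which is where the parallelism would come from.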

Answer to question 2:

Performance is normally measured as the amount of data retrieved relative to the time needed to get it. You can improve performance by following these 5 principles:

2a. Keep the result set small

2b. Minimize the amount of data transferred

2c. Minimize the number of data transfers

2d. Minimize the search overhead

2e. Reduce the database load

It is good to use SE30 first to evaluate the cause of the problem, i.e. whether it lies with your server or your program. If it is caused by the server, then you need Basis and your management to enhance the infrastructure.

Answer to question 3:

Yes, you can run parallel processing in the foreground. You can use the IDoc parallel-processing program RBDAPP01 as a reference to create a similar program.

In summary, I would recommend that you first pin down the performance bottleneck using SE30. If the problem is in the code, then focus on the SELECT and LOOP statements. You can check whether your SELECT statement is efficient using ST05. If you have a nested loop, make sure you follow the parallel cursor approach as described in the SE30 tips and tricks.

Hope it helps.

0 Kudos

Thanks. Can you please give me an example of, or a guide on, how to use parallel processing for a SELECT statement?

Thanks & Regards.

Sam.

0 Kudos

Hello,

Try the Tips & Tricks button in transaction SE30; there you will find samples on parallel processing.

Anyway, check this parallel cursor example:



* Entries: 1000 (ITAB1), 300 (ITAB2)
* Line width: 100
* Both tables sorted by unique key K ascending
DATA: I TYPE I.

I = 1.
LOOP AT ITAB1 INTO WA1.
  DO.
    READ TABLE ITAB2 INTO WA2 INDEX I.
    IF SY-SUBRC <> 0. EXIT. ENDIF.
    IF WA2-K < WA1-K.
      ADD 1 TO I.
    ELSEIF WA2-K = WA1-K.
      " ... matching entry found: process WA1/WA2 here
      ADD 1 TO I.
      EXIT.
    ELSE.
      EXIT.
    ENDIF.
  ENDDO.
  " READ failed above: ITAB2 is exhausted, stop scanning ITAB1
  IF SY-SUBRC <> 0. EXIT. ENDIF.
ENDLOOP.


sagarmehta
Product and Topic Expert
Product and Topic Expert
0 Kudos

Hi Sam,

Try the following code, replacing the variables with your own:

DATA: lv_tasknumber        TYPE i VALUE 0,   "current task number
      lv_total_no_of_tasks TYPE i VALUE 0.

DATA: BEGIN OF lt_per_content OCCURS 0,
        pernr          TYPE pernr_d,
        internal_table TYPE TABLE OF workarea,
        tasknumber     TYPE i,
      END OF lt_per_content.

DATA ls_per_content LIKE LINE OF lt_per_content.

*** Fill the internal table lt_per_content with the pernr list ***

LOOP AT lt_per_content INTO ls_per_content.
  "Associate a unique task number with each pernr
  ADD 1 TO lv_tasknumber.
  ls_per_content-tasknumber = lv_tasknumber.
  MODIFY lt_per_content FROM ls_per_content.
  ADD 1 TO lv_total_no_of_tasks.             "Add to total task count

  "Call the function module to process the data;
  "STARTING NEW TASK starts each call in a new work process
  CALL FUNCTION 'RFC_FUNCTION_OR_ANY_OTHER_FUNC'
    STARTING NEW TASK lv_tasknumber
    PERFORMING task_finished ON END OF TASK
    EXPORTING
      impernr  = ls_per_content-pernr
    TABLES
      ex_table = ls_per_content-internal_table.
ENDLOOP.

WAIT UNTIL lv_total_no_of_tasks EQ 0.


FORM task_finished USING current_task TYPE any.

  SUBTRACT 1 FROM lv_total_no_of_tasks.

  LOOP AT lt_per_content INTO ls_per_content WHERE tasknumber = current_task.
    RECEIVE RESULTS FROM FUNCTION 'RFC_FUNCTION_OR_ANY_OTHER_FUNC'
      TABLES
        ex_table = ls_per_content-internal_table.
    MODIFY lt_per_content FROM ls_per_content.
    EXIT.
  ENDLOOP.

ENDFORM.

I hope this piece of code is of some help to you.

Regards,

Sagar.

PS: try substituting internal_table with the table that you need.

Former Member
0 Kudos

Hi

You can use a remote-enabled function module for this purpose.

If you call the remote function module asynchronously, its processing will be executed in another work process, so parallel processing will be achieved.
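A minimal sketch of such an asynchronous call (the function name 'Z_REMOTE_FUNC' and the flag gv_done are placeholders, not from your system):

```abap
* Sketch: 'Z_REMOTE_FUNC' stands for any remote-enabled function module.
DATA gv_done TYPE abap_bool.

"Start the call in a separate work process and register a callback
CALL FUNCTION 'Z_REMOTE_FUNC'
  STARTING NEW TASK 'TASK1'
  PERFORMING on_finished ON END OF TASK.

"The calling program can do other work here, then wait for completion
WAIT UNTIL gv_done = abap_true.

FORM on_finished USING p_task TYPE clike.
  "Collect the results of the asynchronous call
  RECEIVE RESULTS FROM FUNCTION 'Z_REMOTE_FUNC'
    EXCEPTIONS
      communication_failure = 1
      system_failure        = 2.
  gv_done = abap_true.
ENDFORM.
```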

kesavadas_thekkillath
Active Contributor
0 Kudos

Parallel sessions can be opened with CALL FUNCTION ' ' STARTING NEW TASK.

Just check the F1 help...

But I don't think this will meet your requirement...

Former Member
0 Kudos

As long as you do no updates, parallel processing is always possible. You say that you use a join statement and that it takes a lot of time because of the amount of data. The join works with temporary tables, so the database access itself does not claim the time; ABAP has to go through the temporary files, and that can take a lot of time. In such a case the join is sometimes not the solution: try normal nested SELECT statements. If this takes longer than the join, then the join is the better way and you have to accept the waiting time.
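If you experiment with replacing the join, one common alternative is a first SELECT followed by FOR ALL ENTRIES (sketched here with placeholder tables VBAK/VBAP; substitute your own tables and conditions):

```abap
* Sketch: VBAK/VBAP and the WHERE conditions are placeholders.
DATA: lt_heads TYPE STANDARD TABLE OF vbak,
      lt_items TYPE STANDARD TABLE OF vbap,
      lv_from  TYPE erdat.

lv_from = sy-datum - 30.

"First selection restricts the driver table
SELECT * FROM vbak INTO TABLE lt_heads
         WHERE erdat >= lv_from.

"FOR ALL ENTRIES with an EMPTY driver table would select every row!
IF lt_heads IS NOT INITIAL.
  SELECT * FROM vbap INTO TABLE lt_items
           FOR ALL ENTRIES IN lt_heads
           WHERE vbeln = lt_heads-vbeln.
ENDIF.
```

Whether this beats the join depends on the data volumes and indexes, so measure both variants with ST05 before deciding.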

Best Regards,

Guido

0 Kudos

Thank you. Can you just give me an example of how to achieve this parallel processing for the SELECT statement?

Thanks