Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

need idea to improve performance

Former Member
0 Kudos

Hi friends,

I need ideas on how to improve the performance of this code:



 LOOP AT fl_tab INTO wa_fl_tab.

      AT NEW pernr.


        READ TABLE check_flex INTO wa_check_flex WITH KEY schkz  = 'FLEX'       pernr = wa_fl_tab-pernr .

        IF wa_check_flex-begda > so_date-low.

          so_date1-low = wa_check_flex-begda.          
          so_date1-high = so_date-high.
          so_date1-sign = 'I'.
          so_date1-option = 'BT'.

          APPEND so_date1. 

          SELECT *
          FROM catsdb
          INTO CORRESPONDING FIELDS OF TABLE cats_tab           
        FOR ALL ENTRIES IN fl_tab
          WHERE pernr = wa_fl_tab-pernr
          AND  workdate IN so_date1 .

        ELSE.


          SELECT *
          FROM catsdb
          INTO CORRESPONDING FIELDS OF TABLE cats_tab 
*         FOR ALL ENTRIES IN fl_tab
          WHERE pernr = wa_fl_tab-pernr
          AND  workdate IN so_date .

        ENDIF.


        LOOP AT cats_tab ."Collect hours

          MOVE: cats_tab-pernr  TO cats_tab_col-pernr,
                cats_tab-catshours TO cats_tab_col-catshours,
                cats_tab-status TO cats_tab_col-status.

          COLLECT cats_tab_col .

        ENDLOOP.


        LOOP AT fl_tab ASSIGNING <fl_tab>.

          READ TABLE cats_tab_col ASSIGNING <cats_tab_col> WITH KEY pernr = <fl_tab>-pernr.
          IF sy-subrc = 0.
            <cats_tab_col>-teken_hr = <fl_tab>-mostd.

          ENDIF.

        ENDLOOP.

      ENDAT.

    ENDLOOP.


Regards

1 ACCEPTED SOLUTION

ferry_lianto
Active Contributor

(See the full reply further down in the thread.)

10 REPLIES

Former Member
0 Kudos

Hi Tal,

1. Use binary search with READ TABLE, and field symbols, wherever possible.

2. Delete adjacent duplicates from fl_tab comparing PERNR, then reorganize the logic the following way (a rough sketch follows below):

Delete adjacent duplicates from fl_tab.

SELECT ... FROM catsdb FOR ALL ENTRIES IN fl_tab (basically removing the SELECT from the loop).

Loop at the catsdb internal table.

READ TABLE check_flex ... etc., and continue with the processing.
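A rough sketch of that reorganization, reusing the names from the original post (so_date, check_flex, cats_tab, wa_check_flex and wa_fl_tab as declared in the question; lt_driver is a hypothetical helper table, and the code is untested):

* Sketch only - a de-duplicated copy of fl_tab drives one SELECT
* outside the loop; check_flex is read with BINARY SEARCH.
DATA lt_driver LIKE TABLE OF wa_fl_tab.

lt_driver = fl_tab[].
SORT lt_driver BY pernr.
DELETE ADJACENT DUPLICATES FROM lt_driver COMPARING pernr.

IF NOT lt_driver[] IS INITIAL.
  SELECT pernr workdate catshours status
    FROM catsdb
    INTO CORRESPONDING FIELDS OF TABLE cats_tab
    FOR ALL ENTRIES IN lt_driver
    WHERE pernr    = lt_driver-pernr
      AND workdate IN so_date.
ENDIF.

SORT check_flex BY schkz pernr.

LOOP AT cats_tab.
  READ TABLE check_flex INTO wa_check_flex
       WITH KEY schkz = 'FLEX'
                pernr = cats_tab-pernr
       BINARY SEARCH.
  "... continue with the date check and COLLECT processing here
ENDLOOP.

The per-person FLEX date handling from the original code still has to be added where the comment indicates.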

Regards

Arun

Former Member
0 Kudos

Hi Tal,

I found some improvements you can put in your code:

1) Instead of SELECT * ... select only those fields which you need in your code.

It helps in performance improvement.

2) Try to avoid SELECT statements inside the loop. You can move the SELECT statements outside the loop and get the required data into an internal table.

3) Before using FOR ALL ENTRIES in a SELECT statement, always check whether the internal table you are using with FOR ALL ENTRIES has at least one record (see the sketch below).
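A minimal sketch combining points 1 and 3, reusing the table and field names from the original post (untested):

* Sketch only - select just the needed fields, and never run
* FOR ALL ENTRIES with an empty driver table (otherwise the
* WHERE clause is ignored and the whole table is read).
IF NOT fl_tab[] IS INITIAL.
  SELECT pernr workdate catshours status
    FROM catsdb
    INTO CORRESPONDING FIELDS OF TABLE cats_tab
    FOR ALL ENTRIES IN fl_tab
    WHERE pernr    = fl_tab-pernr
      AND workdate IN so_date.
ENDIF.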

Former Member
0 Kudos

Hi,

Don't use a SELECT statement inside a LOOP, because it will reduce the performance.

See these points:

Ways of Performance Tuning

1. Selection Criteria

2. Select Statements

• Select Queries

• SQL Interface

• Aggregate Functions

• For all Entries

Select Over more than one internal table

Selection Criteria

1. Restrict the data to the selection criteria itself, rather than filtering it out using the ABAP code using CHECK statement.

2. Select with selection list.

SELECT * FROM SBOOK INTO SBOOK_WA.

CHECK: SBOOK_WA-CARRID = 'LH' AND

SBOOK_WA-CONNID = '0400'.

ENDSELECT.

The above code can be much more optimized by the code written below, which avoids CHECK and selects with a selection list.

SELECT CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK

WHERE CARRID = 'LH' AND

CONNID = '0400'.

Select Statements Select Queries

1. Avoid nested selects

SELECT * FROM EKKO INTO EKKO_WA.

SELECT * FROM EKAN INTO EKAN_WA

WHERE EBELN = EKKO_WA-EBELN.

ENDSELECT.

ENDSELECT.

The above code can be much more optimized by the code written below.

SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB

FROM EKKO AS P INNER JOIN EKAN AS F

ON P~EBELN = F~EBELN.

Note: A simple SELECT loop is a single database access whose result is passed to the ABAP program line by line. Nested SELECT loops mean that the number of accesses in the inner loop is multiplied by the number of accesses in the outer loop. One should therefore use nested SELECT loops only if the selection in the outer loop contains very few lines or the outer loop is a SELECT SINGLE statement.

2. Select all the records in a single shot using into table clause of select statement rather than to use Append statements.

SELECT * FROM SBOOK INTO SBOOK_WA.

CHECK: SBOOK_WA-CARRID = 'LH' AND

SBOOK_WA-CONNID = '0400'.

ENDSELECT.

The above code can be much more optimized by the code written below, which avoids CHECK, selects with a selection list and puts the data in one shot using INTO TABLE.

SELECT CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK

WHERE CARRID = 'LH' AND

CONNID = '0400'.

3. When a base table has multiple indices, the where clause should be in the order of the index, either a primary or a secondary index.

To choose an index, the optimizer checks the field names specified in the where clause and then uses an index that has the same order of fields. In certain scenarios, it is advisable to check whether a new index can speed up the performance of a program. This comes in handy in programs that access data from the finance tables.
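For illustration, a hypothetical sketch (ZORDERS and its secondary index on BUKRS and ERDAT are invented for this example; they are not standard objects):

* Hypothetical: ZORDERS has a secondary index on BUKRS and ERDAT,
* so the WHERE clause restricts exactly those indexed fields.
DATA LT_ORDERS TYPE STANDARD TABLE OF ZORDERS.

SELECT * FROM ZORDERS
  INTO TABLE LT_ORDERS
  WHERE BUKRS = '1000'
    AND ERDAT BETWEEN '20070101' AND '20071231'.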

4. For testing existence, use Select.. Up to 1 rows statement instead of a Select-Endselect-loop with an Exit.

SELECT * FROM SBOOK INTO SBOOK_WA

UP TO 1 ROWS

WHERE CARRID = 'LH'.

ENDSELECT.

The above code is more optimized as compared to the code mentioned below for testing existence of a record.

SELECT * FROM SBOOK INTO SBOOK_WA

WHERE CARRID = 'LH'.

EXIT.

ENDSELECT.

5. Use Select Single if all primary key fields are supplied in the Where condition .

If all primary key fields are supplied in the Where conditions you can even use Select Single.

Select Single requires one communication with the database system, whereas Select-Endselect needs two.
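For example, a minimal sketch (besides the client, SBOOK's key is CARRID, CONNID, FLDATE and BOOKID, so the full key is supplied; the literal values are made up):

SELECT SINGLE * FROM SBOOK INTO SBOOK_WA
  WHERE CARRID = 'LH'
    AND CONNID = '0400'
    AND FLDATE = '20070101'
    AND BOOKID = '00000001'.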

Select Statements SQL Interface

1. Use column updates instead of single-row updates

to update your database tables.

SELECT * FROM SFLIGHT INTO SFLIGHT_WA.

SFLIGHT_WA-SEATSOCC =

SFLIGHT_WA-SEATSOCC - 1.

UPDATE SFLIGHT FROM SFLIGHT_WA.

ENDSELECT.

The above mentioned code can be more optimized by using the following code

UPDATE SFLIGHT

SET SEATSOCC = SEATSOCC - 1.

2. For all frequently used Select statements, try to use an index.

SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA

WHERE CARRID = 'LH'

AND CONNID = '0400'.

ENDSELECT.

The above mentioned code can be more optimized by using the following code

SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA

WHERE MANDT IN ( SELECT MANDT FROM T000 )

AND CARRID = 'LH'

AND CONNID = '0400'.

ENDSELECT.

3. Using buffered tables improves the performance considerably.

Bypassing the buffer increases the network load considerably.

SELECT SINGLE * FROM T100 INTO T100_WA

BYPASSING BUFFER

WHERE SPRSL = 'D'

AND ARBGB = '00'

AND MSGNR = '999'.

The above mentioned code can be more optimized by using the following code

SELECT SINGLE * FROM T100 INTO T100_WA

WHERE SPRSL = 'D'

AND ARBGB = '00'

AND MSGNR = '999'.

Select Statements Aggregate Functions

• If you want to find the maximum, minimum, sum and average value or the count of a database column, use a select list with aggregate functions instead of computing the aggregates yourself.

Some of the Aggregate functions allowed in SAP are MAX, MIN, AVG, SUM, COUNT, COUNT( * )

Consider the following extract.

Maxno = 0.

Select * from zflight where airln = 'LF' and cntry = 'IN'.

Check zflight-fligh > maxno.

Maxno = zflight-fligh.

Endselect.

The above mentioned code can be much more optimized by using the following code.

Select max( fligh ) from zflight into maxno where airln = 'LF' and cntry = 'IN'.

Select Statements For All Entries

• The for all entries creates a where clause, where all the entries in the driver table are combined with OR. If the number of entries in the driver table is larger than rsdb/max_blocking_factor, several similar SQL statements are executed to limit the length of the WHERE clause.

The plus

• Large amount of data

• Mixing processing and reading of data

• Fast internal reprocessing of data

• Fast

The Minus

• Difficult to program/understand

• Memory could be critical (use FREE or PACKAGE size)

Points that must be considered when using FOR ALL ENTRIES

• Check that data is present in the driver table

• Sorting the driver table

• Removing duplicates from the driver table

Consider the following extract.

Loop at int_cntry.

Select single * from zfligh into int_fligh

where cntry = int_cntry-cntry.

Append int_fligh.

Endloop.

The above mentioned code can be more optimized by using the following code.

Sort int_cntry by cntry.

Delete adjacent duplicates from int_cntry.

If NOT int_cntry[] is INITIAL.

Select * from zfligh appending table int_fligh

For all entries in int_cntry

Where cntry = int_cntry-cntry.

Endif.

Select Statements Select Over more than one Internal table

1. It is better to use a view instead of nested Select statements.

SELECT * FROM DD01L INTO DD01L_WA

WHERE DOMNAME LIKE 'CHAR%'

AND AS4LOCAL = 'A'.

SELECT SINGLE * FROM DD01T INTO DD01T_WA

WHERE DOMNAME = DD01L_WA-DOMNAME

AND AS4LOCAL = 'A'

AND AS4VERS = DD01L_WA-AS4VERS

AND DDLANGUAGE = SY-LANGU.

ENDSELECT.

The above code can be more optimized by extracting all the data from view DD01V.

SELECT * FROM DD01V INTO DD01V_WA

WHERE DOMNAME LIKE 'CHAR%'

AND DDLANGUAGE = SY-LANGU.

ENDSELECT.

2. To read data from several logically connected tables, use a join instead of nested Select statements. Joins are preferred only if all the primary key fields are available in the WHERE clause for the tables that are joined. If the primary keys are not provided in the join, the joining of the tables itself takes time.

SELECT * FROM EKKO INTO EKKO_WA.

SELECT * FROM EKAN INTO EKAN_WA

WHERE EBELN = EKKO_WA-EBELN.

ENDSELECT.

ENDSELECT.

The above code can be much more optimized by the code written below.

SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB

FROM EKKO AS P INNER JOIN EKAN AS F

ON P~EBELN = F~EBELN.

3. Instead of using nested Select loops it is often better to use subqueries.

SELECT * FROM SPFLI

INTO TABLE T_SPFLI

WHERE CITYFROM = 'FRANKFURT'

AND CITYTO = 'NEW YORK'.

SELECT * FROM SFLIGHT AS F

INTO SFLIGHT_WA

FOR ALL ENTRIES IN T_SPFLI

WHERE SEATSOCC < F~SEATSMAX

AND CARRID = T_SPFLI-CARRID

AND CONNID = T_SPFLI-CONNID

AND FLDATE BETWEEN '19990101' AND '19990331'.

ENDSELECT.

The above mentioned code can be even more optimized by using subqueries instead of for all entries.

SELECT * FROM SFLIGHT AS F INTO SFLIGHT_WA

WHERE SEATSOCC < F~SEATSMAX

AND EXISTS ( SELECT * FROM SPFLI

WHERE CARRID = F~CARRID

AND CONNID = F~CONNID

AND CITYFROM = 'FRANKFURT'

AND CITYTO = 'NEW YORK' )

AND FLDATE BETWEEN '19990101' AND '19990331'.

ENDSELECT.

1. Table operations should be done using explicit work areas rather than via header lines (a short sketch follows this list).

2. Always try to use binary search instead of linear search. But don’t forget to sort your internal table before that.

3. A dynamic key access is slower than a static one, since the key specification must be evaluated at runtime.

4. A binary search using secondary index takes considerably less time.

5. LOOP ... WHERE is faster than LOOP/CHECK because LOOP ... WHERE evaluates the specified condition internally.

6. Modifying selected components using "MODIFY itab ... TRANSPORTING f1 f2" accelerates the task of updating a line of an internal table.
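For point 1, a minimal sketch (ITAB and WA are hypothetical names; SBOOK is used only as a convenient line type):

* Explicit work area instead of a table with a header line.
DATA: ITAB TYPE STANDARD TABLE OF SBOOK,
      WA   TYPE SBOOK.

WA-CARRID = 'LH'.
APPEND WA TO ITAB.          "instead of APPEND ITAB via the header line

READ TABLE ITAB INTO WA INDEX 1.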

Point # 2

READ TABLE ITAB INTO WA WITH KEY K = 'X' BINARY SEARCH.

IS MUCH FASTER THAN USING

READ TABLE ITAB INTO WA WITH KEY K = 'X'.

If TAB has n entries, linear search runs in O( n ) time, whereas binary search takes only O( log2( n ) ).

Point # 3

READ TABLE ITAB INTO WA WITH KEY K = 'X'. IS FASTER THAN USING

READ TABLE ITAB INTO WA WITH KEY (NAME) = 'X'.

Point # 5

LOOP AT ITAB INTO WA WHERE K = 'X'.

" ...

ENDLOOP.

The above code is much faster than using

LOOP AT ITAB INTO WA.

CHECK WA-K = 'X'.

" ...

ENDLOOP.

Point # 6

WA-DATE = SY-DATUM.

MODIFY ITAB FROM WA INDEX 1 TRANSPORTING DATE.

The above code is more optimized as compared to

WA-DATE = SY-DATUM.

MODIFY ITAB FROM WA INDEX 1.

7. Accessing the table entries directly in a "LOOP ... ASSIGNING ..." accelerates the task of updating a set of lines of an internal table considerably

8. If collect semantics is required, it is always better to use to COLLECT rather than READ BINARY and then ADD.

9. "APPEND LINES OF itab1 TO itab2" accelerates the task of appending a table to another table considerably as compared to “ LOOP-APPEND-ENDLOOP.”

10. “DELETE ADJACENT DUPLICATES“ accelerates the task of deleting duplicate entries considerably as compared to “ READ-LOOP-DELETE-ENDLOOP”.

11. "DELETE itab FROM ... TO ..." accelerates the task of deleting a sequence of lines considerably as compared to “ DO -DELETE-ENDDO”.

Point # 7

Accessing the table lines directly via a field symbol (LOOP ... ASSIGNING) makes the program faster as compared to copying each line into a work area and writing it back with MODIFY.

e.g,

LOOP AT ITAB ASSIGNING <WA>.

I = SY-TABIX MOD 2.

IF I = 0.

<WA>-FLAG = 'X'.

ENDIF.

ENDLOOP.

The above code works faster as compared to

LOOP AT ITAB INTO WA.

I = SY-TABIX MOD 2.

IF I = 0.

WA-FLAG = 'X'.

MODIFY ITAB FROM WA.

ENDIF.

ENDLOOP.

Point # 8

LOOP AT ITAB1 INTO WA1.

READ TABLE ITAB2 INTO WA2 WITH KEY K = WA1-K BINARY SEARCH.

IF SY-SUBRC = 0.

ADD: WA1-VAL1 TO WA2-VAL1,

WA1-VAL2 TO WA2-VAL2.

MODIFY ITAB2 FROM WA2 INDEX SY-TABIX TRANSPORTING VAL1 VAL2.

ELSE.

INSERT WA1 INTO ITAB2 INDEX SY-TABIX.

ENDIF.

ENDLOOP.

The above code uses BINARY SEARCH for collect semantics. READ BINARY runs in O( log2(n) ) time. The above piece of code can be more optimized by

LOOP AT ITAB1 INTO WA.

COLLECT WA INTO ITAB2.

ENDLOOP.

SORT ITAB2 BY K.

COLLECT, however, uses a hash algorithm and is therefore independent

of the number of entries (i.e. O(1)) .

Point # 9

APPEND LINES OF ITAB1 TO ITAB2.

This is more optimized as compared to

LOOP AT ITAB1 INTO WA.

APPEND WA TO ITAB2.

ENDLOOP.

Point # 10

DELETE ADJACENT DUPLICATES FROM ITAB COMPARING K.

This is much more optimized as compared to

READ TABLE ITAB INDEX 1 INTO PREV_LINE.

LOOP AT ITAB FROM 2 INTO WA.

IF WA = PREV_LINE.

DELETE ITAB.

ELSE.

PREV_LINE = WA.

ENDIF.

ENDLOOP.

Point # 11

DELETE ITAB FROM 450 TO 550.

This is much more optimized as compared to

DO 101 TIMES.

DELETE ITAB INDEX 450.

ENDDO.

12. Copying internal tables using "ITAB2[] = ITAB1[]" is much faster than "LOOP-APPEND-ENDLOOP".

13. Specify the sort key as restrictively as possible to run the program faster.

Point # 12

ITAB2[] = ITAB1[].

This is much more optimized as compared to

REFRESH ITAB2.

LOOP AT ITAB1 INTO WA.

APPEND WA TO ITAB2.

ENDLOOP.

Point # 13

“SORT ITAB BY K.” makes the program runs faster as compared to “SORT ITAB.”

Internal Tables contd…

Hashed and Sorted tables

1. For single read access hashed tables are more optimized as compared to sorted tables.

2. For partial sequential access sorted tables are more optimized as compared to hashed tables

Hashed And Sorted Tables

Point # 1

Consider the following example where HTAB is a hashed table and STAB is a sorted table

DO 250 TIMES.

N = 4 * SY-INDEX.

READ TABLE HTAB INTO WA WITH TABLE KEY K = N.

IF SY-SUBRC = 0.

" ...

ENDIF.

ENDDO.

This runs faster for single read access as compared to the following same code for sorted table

DO 250 TIMES.

N = 4 * SY-INDEX.

READ TABLE STAB INTO WA WITH TABLE KEY K = N.

IF SY-SUBRC = 0.

" ...

ENDIF.

ENDDO.

Point # 2

Similarly for Partial Sequential access the STAB runs faster as compared to HTAB

LOOP AT STAB INTO WA WHERE K = SUBKEY.

" ...

ENDLOOP.

This runs faster as compared to

LOOP AT HTAB INTO WA WHERE K = SUBKEY.

" ...

ENDLOOP.

former_member194613
Active Contributor
0 Kudos

This SELECT is completely wrong!

SELECT *

FROM catsdb

INTO CORRESPONDING FIELDS OF TABLE cats_tab

FOR ALL ENTRIES IN fl_tab

WHERE pernr = wa_fl_tab-pernr

AND workdate IN so_date1 .

The SELECT works with the work area field wa_fl_tab-pernr and not with the table.

You should either do a SELECT ... FOR ALL ENTRIES outside of the loop, or a simple SELECT with the work area inside the loop.

A SELECT inside a loop can be okay if the WHERE condition can only be determined inside the loop, which seems to be the case here.
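A minimal sketch of the second option, reusing the names from the original post (APPENDING keeps the results of earlier loop passes; untested):

* Sketch only - plain SELECT with the work area, no FOR ALL ENTRIES.
SELECT *
  FROM catsdb
  APPENDING CORRESPONDING FIELDS OF TABLE cats_tab
  WHERE pernr    = wa_fl_tab-pernr
    AND workdate IN so_date1.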

Run SE30 (with internal tables) and ST05, which will guide you to the bottlenecks.

see here

SQL trace:

/people/siegfried.boes/blog/2007/09/05/the-sql-trace-st05-150-quick-and-easy

SE30

/people/siegfried.boes/blog/2007/11/13/the-abap-runtime-trace-se30--quick-and-easy

Siegfried

Former Member
0 Kudos

Hi TAL S,

I believe that there are serious flaws in your logic (and this has nothing to do with the performance). I do not believe that this piece of code will do what it is intended to do.

1) You seem to be appending to a select-option. With each iteration this select-option will grow larger and larger with past data; there is no REFRESH statement. You are using the date criteria of previous records (person A) to validate person B. I believe that this logic needs revisiting.

2) The SELECT ... FOR ALL ENTRIES on table catsdb will not even compile, because you are not using the table from the FOR ALL ENTRIES clause in your WHERE clause. Gathering from this, I assume that this is not an existing piece of code but something you are trying to introduce. Let me raise a red flag about the functionality (logic) first before I can help you with the performance.

3) You seem to loop over table fl_tab twice, one loop inside the other. This again is not the way to do it. I do not see why you need a nested loop here. Why can't you run the second loop after the first?

I would love to help you with your performance tuning, but I strongly believe that your logic is incorrect. I advise you to concentrate on your logic and get it to work correctly first. Once that is done, I would be more than glad to tune the code for you.

0 Kudos

Tal - Mark's right. It makes no sense to try to tune a program that doesn't do what it's supposed to do. Please follow these steps before asking the forum for performance help:

Get the program to pass a syntax check.

Get it working (i.e. test - get users involved if needed).

Do your own performance analysis to determine problem spots.

Go to the forum with problems you can't solve.

Rob

0 Kudos

Hi Mark,

Thanks. I changed it, please take a look.

Regards


LOOP AT fl_tab INTO wa_fl_tab.
 
      AT NEW pernr.
 
 
        READ TABLE check_flex INTO wa_check_flex WITH KEY schkz  = 'FLEX'       pernr = wa_fl_tab-pernr .
 
        IF wa_check_flex-begda > so_date-low.

        refresh : so_date1.
 
          so_date1-low = wa_check_flex-begda.          
          so_date1-high = so_date-high.
          so_date1-sign = 'I'.
          so_date1-option = 'BT'.
 
          APPEND so_date1. 
 
          SELECT *
          FROM catsdb
          INTO CORRESPONDING FIELDS OF TABLE cats_tab           
          WHERE pernr = wa_fl_tab-pernr
          AND  workdate IN so_date1 .
 
        ELSE.
 
 
          SELECT *
          FROM catsdb
          INTO CORRESPONDING FIELDS OF TABLE cats_tab 
          WHERE pernr = wa_fl_tab-pernr
          AND  workdate IN so_date .
 
        ENDIF.
 
 
        LOOP AT cats_tab ."Collect hours
 
          MOVE: cats_tab-pernr  TO cats_tab_col-pernr,
                cats_tab-catshours TO cats_tab_col-catshours,
                cats_tab-status TO cats_tab_col-status.
 
          COLLECT cats_tab_col .
 
        ENDLOOP.
 
 
      ENDAT.
 
    ENDLOOP.


        LOOP AT fl_tab ASSIGNING <fl_tab>.
 
          READ TABLE cats_tab_col ASSIGNING <cats_tab_col> WITH KEY pernr = <fl_tab>-pernr.
          IF sy-subrc = 0.
            <cats_tab_col>-teken_hr = <fl_tab>-mostd.
 
          ENDIF.
 
        ENDLOOP.

0 Kudos

Still not sure about your logic (it does look better). I've marked my changes, so try:

SORT check_flex BY schkz pernr.     "<======

LOOP AT fl_tab INTO wa_fl_tab.
  AT NEW pernr.
    READ TABLE check_flex INTO wa_check_flex
      WITH KEY schkz = 'FLEX'
               pernr = wa_fl_tab-pernr
      BINARY SEARCH.                "<======
    IF wa_check_flex-begda > so_date-low.
      REFRESH : so_date1.
      so_date1-low = wa_check_flex-begda.
      so_date1-high = so_date-high.
      so_date1-sign = 'I'.
      so_date1-option = 'BT'.
      APPEND so_date1.
      SELECT *
      FROM catsdb
      APPENDING CORRESPONDING FIELDS OF TABLE cats_tab    "<=====
      WHERE pernr = wa_fl_tab-pernr
      AND  workdate IN so_date1 .
    ELSE.
      SELECT *
      FROM catsdb
      APPENDING CORRESPONDING FIELDS OF TABLE cats_tab    "<=====
      WHERE pernr = wa_fl_tab-pernr
      AND  workdate IN so_date .
    ENDIF.
  ENDAT.
ENDLOOP.

* Moved out of loop                                      "<======
LOOP AT cats_tab ."Collect hours
  MOVE: cats_tab-pernr  TO cats_tab_col-pernr,
        cats_tab-catshours TO cats_tab_col-catshours,
        cats_tab-status TO cats_tab_col-status.
  COLLECT cats_tab_col .
ENDLOOP.

SORT cats_tab_col BY pernr.     "<=======
LOOP AT fl_tab ASSIGNING <fl_tab>.
  READ TABLE cats_tab_col ASSIGNING <cats_tab_col>
    WITH KEY pernr = <fl_tab>-pernr
    BINARY SEARCH.             "<=======
  IF sy-subrc = 0.
    <cats_tab_col>-teken_hr = <fl_tab>-mostd.
  ENDIF.
ENDLOOP.

OK - I moved the remaining nested loop outside of the inner one.

Rob

Message was edited by:

Rob Burbank

ferry_lianto
Active Contributor
0 Kudos

Hi,

Please try this ...


RANGES: r_pernr1 for catsdb-pernr,
        r_pernr2 for catsdb-pernr.

LOOP AT fl_tab INTO wa_fl_tab.

  AT NEW pernr.
 
    READ TABLE check_flex INTO wa_check_flex 
               WITH KEY schkz  = 'FLEX'       
                        pernr  = wa_fl_tab-pernr .
 
    IF wa_check_flex-begda > so_date-low.
 
      so_date1-low = wa_check_flex-begda.          
      so_date1-high = so_date-high.
      so_date1-sign = 'I'.
      so_date1-option = 'BT'.
      APPEND so_date1. 

      r_pernr1-sign = 'I'.
      r_pernr1-option = 'EQ'.
      r_pernr1-low = wa_fl_tab-pernr.
      APPEND r_pernr1.         
          
    ELSE.
   
      r_pernr2-sign = 'I'.
      r_pernr2-option = 'EQ'.
      r_pernr2-low = wa_fl_tab-pernr.
      APPEND r_pernr2. 
 
    ENDIF.   
  ENDAT.
ENDLOOP.       
       
SELECT *
FROM catsdb
INTO CORRESPONDING FIELDS OF TABLE cats_tab           
WHERE pernr in r_pernr1
  AND  workdate IN so_date1.
 
 
SELECT *
FROM catsdb
APPENDING CORRESPONDING FIELDS OF TABLE cats_tab 
WHERE pernr IN r_pernr2
  AND workdate IN so_date.

LOOP AT cats_tab.        "Collect hours

  MOVE: cats_tab-pernr     TO cats_tab_col-pernr,
        cats_tab-catshours TO cats_tab_col-catshours,
        cats_tab-status    TO cats_tab_col-status.
 
  COLLECT cats_tab_col.
ENDLOOP.

...

Regards,

Ferry Lianto

former_member194613
Active Contributor
0 Kudos

Unfortunately the solution still has flaws:

+ the ranges work only for 1000 entries

+ the missing binary search makes the first loop non-linear

Quite suboptimal.
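A sketch of one way to address both points on top of Ferry's version, not taken from the thread itself (lt_pernr1, lt_pernr2 and ls_pernr are hypothetical helpers; untested): sort check_flex once and read it with BINARY SEARCH inside the loop, and collect the personnel numbers into driver tables for FOR ALL ENTRIES instead of RANGES, which avoids the limit on the size of the generated WHERE clause.

* Sketch only - driver tables + FOR ALL ENTRIES instead of RANGES,
* plus SORT / BINARY SEARCH for the READ inside the loop.
TYPES: BEGIN OF ty_pernr,
         pernr TYPE catsdb-pernr,
       END OF ty_pernr.
DATA: lt_pernr1 TYPE STANDARD TABLE OF ty_pernr,
      lt_pernr2 TYPE STANDARD TABLE OF ty_pernr,
      ls_pernr  TYPE ty_pernr.

SORT check_flex BY schkz pernr.

* Inside the AT NEW pernr block, read check_flex with BINARY SEARCH and,
* instead of APPEND r_pernr1 / r_pernr2:
*   ls_pernr-pernr = wa_fl_tab-pernr.
*   APPEND ls_pernr TO lt_pernr1.    "or lt_pernr2 in the ELSE branch

IF NOT lt_pernr1[] IS INITIAL.
  SELECT * FROM catsdb
    APPENDING CORRESPONDING FIELDS OF TABLE cats_tab
    FOR ALL ENTRIES IN lt_pernr1
    WHERE pernr    = lt_pernr1-pernr
      AND workdate IN so_date1.
ENDIF.

IF NOT lt_pernr2[] IS INITIAL.
  SELECT * FROM catsdb
    APPENDING CORRESPONDING FIELDS OF TABLE cats_tab
    FOR ALL ENTRIES IN lt_pernr2
    WHERE pernr    = lt_pernr2-pernr
      AND workdate IN so_date.
ENDIF.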