
The performance of ‘STARTING NEW TASK’ when handling mass data

I will just show some key code below:


In function module ‘ZSYL_TT’, I insert the data of an internal table into a Z* database table (assume all key fields are specified):
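(The function module itself is not preserved in the archived post. A minimal sketch of what it might look like, assuming a remote-enabled module with a TABLES parameter IT_DATA and a database table ZSYL_DATA; both names are my assumptions:)

  FUNCTION zsyl_tt.
  *"------------------------------------------------------------------
  *" Hypothetical remote-enabled interface (the real one is not shown):
  *"   TABLES
  *"      IT_DATA STRUCTURE ZSYL_DATA
  *"------------------------------------------------------------------

    " Mass insert of the received rows into the Z* database table
    INSERT zsyl_data FROM TABLE it_data.
    COMMIT WORK.

  ENDFUNCTION.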


In this program, I split five million records into five internal tables of one million records each and run the inserts with parallel processing. It took around 5-6 minutes.
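(The calling program is also not preserved. Here is a minimal sketch of how the five packages might be dispatched with STARTING NEW TASK; GT_PACKAGES, TT_DATA, IT_DATA and ON_TASK_DONE are names I made up for illustration:)

  REPORT zsyl_parallel_test.

  TYPES: tt_data TYPE STANDARD TABLE OF zsyl_data WITH DEFAULT KEY.

  DATA: gt_packages TYPE STANDARD TABLE OF tt_data, " five packages of 1,000,000 rows
        gv_done     TYPE i,
        gv_task(8)  TYPE c,
        gv_index(1) TYPE n.

  FIELD-SYMBOLS: <lt_package> TYPE tt_data.

  START-OF-SELECTION.
  * ... fill gt_packages with five packages of one million records each ...

    LOOP AT gt_packages ASSIGNING <lt_package>.
      gv_index = sy-tabix.
      CONCATENATE 'TASK' gv_index INTO gv_task.
  *   ZSYL_TT must be remote-enabled to be called with STARTING NEW TASK
      CALL FUNCTION 'ZSYL_TT'
        STARTING NEW TASK gv_task
        DESTINATION IN GROUP DEFAULT
        PERFORMING on_task_done ON END OF TASK
        TABLES
          it_data               = <lt_package>
        EXCEPTIONS
          communication_failure = 1
          system_failure        = 2
          resource_failure      = 3.
      IF sy-subrc <> 0.
  *     No free work process / RFC resources: handle or retry here
      ENDIF.
    ENDLOOP.

  * Block until all five asynchronous tasks have reported back
    WAIT UNTIL gv_done >= 5.

  FORM on_task_done USING p_task.
  * RECEIVE must be called once per finished task, even without results
    RECEIVE RESULTS FROM FUNCTION 'ZSYL_TT'
      EXCEPTIONS
        communication_failure = 1
        system_failure        = 2.
    gv_done = gv_done + 1.
  ENDFORM.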

Next, I use the normal approach, which is sequential processing:
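(Again a sketch, using the same hypothetical names as above: the same five packages are passed one after another in the current work process, with no RFC involved.)

  LOOP AT gt_packages ASSIGNING <lt_package>.
    CALL FUNCTION 'ZSYL_TT'
      TABLES
        it_data = <lt_package>.
  ENDLOOP.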

The five internal tables of one million records each are now processed in sequence, and it took around 7 minutes.


Maybe this test is not rigorous, but it seems we don’t get much benefit from parallel processing. One reason I can imagine is the copying of data from the current process to the new task: passing one million records to each task takes time.


Moreover, it seems that STARTING NEW TASK actually creates a new logon rather than opening a new dialog process within the current session. (Correct me if I am wrong.)


So, could anyone give more details about the mechanism of STARTING NEW TASK and suggest a better way to do this?
