
Long runtime for VBUK_2 header table update

former_member204080
Active Contributor

Hi Team,

We are currently performing a time-based reduction scenario to build
our pre-production system from a staging server (a copy of production).

We are stuck with an issue where the update of VBUK_2 has been taking a
long time. This step started on 26th January and has been running for 50
hours, but it does not look like it will complete.

We have set all the required parameters for improving the performance of the fill header step, as below.

We have also created an index ZM1 on VBFA with the fields below, as per note 1256679:

MANDT
VBELN
VBELV
POSNV

No change was observed after the above activity. The attached image has the parameter settings.

Please suggest if there is any other parameter/setting that needs to be changed to improve performance.

TDMS version 4.0 SP3

Table: VBUK

Total no. of entries: 45,809,350

Table: VBFA

Total no. of entries: 282,495,909

Source: ECC6 EHP4

Receiver: ECC6 EHP4

Note: The same refresh activity was performed last year, when the source DB size was 4.6 TB; now it is 5.2 TB.

Last time it completed in 16 hours, but now it has been running for the past 4 days and has still not completed.

It seems that the data has been split into 5,549 portions and the job is filling data for each portion. However, once a portion is imported, it again scans the rest of the entries in sequence, and only once it reaches the next portion (1213) does the import complete. We cancelled that run as well.

Each portion was taking 10 minutes. After updating the DB statistics for the cluster table this improved a bit, but not sufficiently. If I calculate based on the above number of portions and the time taken per portion, it still needs approximately 95 hours to complete the fill header.
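For reference, the rough figure above is just the remaining portions multiplied by the observed time per portion, divided by however many processes are really working in parallel. A minimal sketch of that arithmetic (the per-portion time and parallel degree below are illustrative assumptions, not measured values):

```python
def estimated_hours(remaining_portions: int,
                    minutes_per_portion: float,
                    parallel_jobs: int = 1) -> float:
    """Back-of-envelope runtime estimate for the fill header step.

    Assumes every portion takes roughly the same time and that all
    configured parallel jobs actually contribute (which was not the
    case here while P_PARA was effectively being ignored).
    """
    return remaining_portions * minutes_per_portion / (60.0 * parallel_jobs)

# Illustrative values only: 5,549 portions at roughly 1 minute each with a
# single effective process lands in the same ballpark as the ~95 hours
# quoted above; with 8 truly parallel jobs the same work would be ~12 hours.
print(estimated_hours(5549, 1.0, parallel_jobs=1))  # ~92.5 hours
print(estimated_hours(5549, 1.0, parallel_jobs=8))  # ~11.6 hours
```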

So we changed the settings for FILL VBUK_2. Previously there were different settings; we changed P_PARA from 4 to 8 and P_SELECT from FULL to R.

Please suggest

Regards,

Murali

Accepted Solutions (0)

Answers (2)


former_member204080
Active Contributor

Hi Tanu,

Sorry, the message number is 80548. Yes, we have created the index and restarted the fill header for VBUK, but the issue was still there.

I got a reply to the message saying that they released a note yesterday:

1956865 - P_PARA value is set to a value with leading spaces by Data Selection and activity runs longer

This is to make sure that TDMS uses the P_PARA parameter as we set it. Previously we set it to 8, but there was still only 1 process updating the cluster.

After implementing the note, the execution seems to have picked up and it is running fast.

I will monitor it for the next few hours and let you know once it has completed.

Regards,

Murali

Avenger_Girl23
Employee

Hi Murali,

If the parameter is set for one of the SOs, then it has to be set for the others also; otherwise only the first one will be started. P_CLU should not be changed during the runtime but rather before the activity is started. P_PARA can be changed at any time.

It also matters what time slice you have chosen for the transfer; if you have selected more than 4 months it will definitely take time, and the most you can do is check your resources.

In future, you can start this activity with 10-15 work processes. Go to the Batch Processing tab -> Detail Settings -> add the activity -> set the number of jobs.

You can also do this for steps like the deletion and selection of data.

Set the system parameter rdisp/max_wprun_time to 7200.

Regards,

Bharti

former_member204080
Active Contributor

Hi Bharti,

Yes, we have kept the parameter P_CLU as Y for VBUK and VBUK_1, and it was set for VBUK_2 as well.

We have kept the time slice at 2 years of transactional data. The number of records might be high, but we did the same activity last year when our total DB size was 4.6 TB. Back then the same fill header ran for 15 hours and completed, on TDMS 3.0, and we did not even create the index.

We have enough background and dialog work processes (20 each) in all the participating systems.

But will a fill header task run with more background jobs if we add them in the Batch Processing tab?

Is P_PARA not the right parameter for starting the fill header in parallel background jobs?

Even after implementing the note below, it still seems to take a long time:

1956865 - P_PARA value is set to a value with leading spaces by Data Selection and activity runs longer

Could you please explain the use of the P_PARA parameter in detail? Will it start processing the fill header request with 8 (P_PARA = 8) parallel background jobs, or does a single background job use 8 dialog work processes to update the cluster portions?
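To make the question concrete, here is a purely conceptual sketch (not TDMS internals): whichever way P_PARA is interpreted, what matters for throughput is how many workers end up consuming cluster portions at the same time.

```python
from concurrent.futures import ThreadPoolExecutor

def fill_portion(portion_id: int) -> int:
    """Placeholder for the per-portion cluster update work."""
    return portion_id

def fill_header(portion_count: int, parallel_degree: int) -> int:
    """Process all portions with `parallel_degree` concurrent workers.

    Whether the workers are separate background jobs or dialog work
    processes spawned by a single job, the wall-clock time scales with
    portion_count / parallel_degree as long as all workers stay busy.
    """
    with ThreadPoolExecutor(max_workers=parallel_degree) as pool:
        return sum(1 for _ in pool.map(fill_portion, range(portion_count)))

if __name__ == "__main__":
    processed = fill_header(portion_count=5549, parallel_degree=8)
    print(f"processed {processed} portions")
```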

Regards,

Murali


former_member204080
Active Contributor

Hi All,

Could you please clarify whether my understanding below is correct?

Currently my package is stuck at the update of the TDMS header tables. (This step determines the header entries of the objects in scope for data reduction: it scans the tables and, based on the time slice provided, determines the historical entries that have to be copied to the receiver, storing them in the header table to keep the business process data consistent. Since this data is not directly available, TDMS has to calculate and fill it.)

When I started the above step, it started data selection for the TDMS header tables in parallel (where the actual historical data, i.e. sales orders and their preceding document numbers, etc., is collected from the sender system tables and stored in a cluster table). Out of 88 objects it has collected 45, and 33 objects have not yet been started; the status of this step is showing as scheduled. In the PEM job log it shows the following:

Activity Identification of Relevant Sales History Documents has been executed by PEM

No type variant exists for the current notification object

Since data selection will only start for objects which are completely filled, and VBUK_2 is not yet filled, I guess that is why it is showing the above error.

When data selection for the header tables was at 45%, I started data selection for all tables and it completed 100%.

My queries are:

1. Is data selection for all tables independent of the fill header and the data selection of the TDMS header tables?

It completed 100%, but when I compared the entries in its job log with DTLMON, there were 33 objects in 'uncalculated' status (are these 33 from the TDMS header tables?).

2. Current status: update of VBUK_2 -> in process

Data selection for TDMS header tables: 57% completed (33 remaining)

Data selection for all tables: completed

Based on the above status, can I start the data transfer in parallel? If I start it now, will there be any problem with data inconsistency, etc.?

Please suggest

Regards,

Murali

former_member204080
Active Contributor

Hi All,

We have started the data transfer, but it took only 6 background jobs, as those were what was available at the time.

I have changed the operation modes and now 16 background work processes are available.

How can I start the data transfer using 16 background jobs? Should we stop and start the step again?

I checked in the troubleshooter; there was no option to change this during runtime.

Regards,

Murali

former_member65049
Active Participant

Hi Murali,

You can change the number of jobs for each activity using the option below:

Go to the menu option "Process Settings -> Settings for Batch Processing".

In the "Detailed Settings" tab, you can make an entry for the data transfer activity.

In the Jobs field, enter the number of jobs that you want to allocate to this activity.

Hope this helps,

Regards,

Rupam

former_member204080
Active Contributor

Hi Rupam,

Sorry, I missed mentioning this in my previous reply. I have already set the number of jobs to 15 in the package settings, in the Batch Processing tab.

The issue was that when I started the data transfer there were only 8 background work processes available, so it started with 6 background jobs.

So I have now switched the operation modes and the total number of background work processes is 16, but the data transfer is still running with only 6 jobs.

Is there any option to make the data transfer run with 15 background jobs at runtime?

I could not see an option in the troubleshooter; the only way I can see is to stop the data transfer and start it again, so that it starts with 15 jobs.

Please suggest

Regards,

Murali


former_member65049
Active Participant

Hi Murali,

Generally, running jobs pick up the changed settings.

In case that has not worked for you, you need to stop the data transfer activity using the menu option "Process Settings -> Settings for Data Transfer".

After this, restart the data transfer activity from the process tree.

Regards,
Rupam


Former Member

Hello Murali,

This is in response to your queries:

1. Is data selection for all tables independent of the fill header and the data selection of the TDMS header tables?

(It completed 100%, but when compared with DTLMON there were 33 objects in 'uncalculated' status. Are these 33 from the TDMS header tables?)

Yes, data selection for all tables is independent of the fill headers and of the data selection for the fill headers.

2. Update of VBUK_2 -> in process

Data selection for TDMS header tables: 57% completed (33 remaining)

Data selection for all tables: completed

(Based on the above status, can I start the data transfer in parallel? If I start it now, will there be any problem with data inconsistency, etc.?)

You can start the data transfer activity even when data selection has completed for only one object; there will be no inconsistencies. When the data transfer activity is started, only those objects for which data selection is completed are considered for transfer.

Hope your VBUK fill header has completed; if it has not, please update the current status of the activity for further analysis.

former_member204080
Active Contributor

Hi All,

Thank you for your responses. The fill header for VBUK completed after 5 days.


Avenger_Girl23
Employee

Hi Murali,

If you changed the parameters while the update step was already running, then it might not help.

The parameters have to be changed before you start the activity, and the large tables should be specified with size category 'Large' so that they are started in a background job, e.g. tables such as:

UAIDETAILDATA

RSB_DOC_XML

CE4DC01_ACCT

In addition, you can change the system parameters (in case the hardware is available).

Then make sure that you have set up enough batch processes.

You can also select the activity and set the number of batch processes for that activity.

Specify the reduction rules, e.g. the tables which are not relevant for transfer (this is an optional activity in the process tree).

Make sure that the following activity IDs have P_CLU = Y:

TD05X_FILL_VBUK_1

TD05X_FILL_VBUK_2

For TD05X_FILL_VBUK, also set P_POWR to 1000.

If the package is still running, then I would suggest you create a new package from the beginning.

The step 'Update the Header Table' is a mass starter, which means that it will automatically start a new job and try to fill the header.

Hope the above helps you. Please also create an OSS message so that we can check the system as well.

Best regards,

Bharti

former_member204080
Active Contributor

Hi Bharti,

Thanks for the details. Yes, we have raised the OSS message; SAP did a statistics update on the cluster table CNVTDMS_05_CLU and said that since the data volume is huge, TDMS will take time.

But would it take more than 4 days even in the worst case?

All required parameters were already set. But when the update of the TDMS header tables started, it automatically started the data selection of the TDMS header tables. Once data selection for the TDMS header tables was at 45%, I started data selection for all tables. This completed (100%) but gave an error for 4 tables which were large (timeout error). I changed the size category for these to 'Large' (to run in background) and did an auto-repair of the objects. Once this was done, data selection for the remaining 4 tables also completed.

After this I could still see that the update of the TDMS header for VBUK_2 was still running (after 3 days), so I changed P_PARA from 4 to 8. This can be done during runtime as well.

Then I raised message 80584.

Regards,

Murali

former_member204080
Active Contributor


Hi Bharti,

If possible, please check OSS message 80584; we have updated all the details there.

The R/3 and HTTP connections are open.

Regards,

Murali

former_member204080
Active Contributor

Hi Bharti,

Creating a new package is not a feasible solution right now. We do not have much time left to complete the refresh and hand over the system to the business; we have two days left.

If we start fresh, that might take longer. However, even the current package runtime is high (the update of VBUK still needs approximately 95 hours, calculated from the number of portions calculated and imported).

All the parameters for optimum performance were set after the system analysis phase was completed.

Please help so that we can complete the remaining activities in the next two days.

Regards,

Murali

Former Member

Hello Murali,

Is the OSS message number correct? When I look for this message, it is under the BC-DB area and is already confirmed. If you can let me know, I will quickly ask my colleagues to look at it.

Also, there is note 1256679, which is for a similar issue. Can you please check if this helps?

Thanks and Regards,

Tanu