
BODS Jobs :

Hi.

I am trying to understand the time taken by the various jobs on our existing BODS schedules.

Lately we are seeing that some jobs which pull data from SAP R/3 are taking too long.

When I look at the jobs, each job has at least 20-25 data flows (DFs) on average.

When I look at the console, the start time and end time are explicitly visible against each job, and the duration shows the total time taken by that particular job.

The trace, on the other hand, lists each DF line by line.

BUT: looking at the trace, I have to work out how long each DF is actually taking by finding the start time and end time of each DF and calculating the difference myself.

I would like to know whether there is an easy command I could include in a script that calculates the duration of each internal data flow, at every block.

I think it should be possible; I just don't see how to make it easily visible.

For now we have manually checked the time taken by each DF and are monitoring it.

A direct column in the trace showing the duration of each DF would be great.

===========================

For example, this is what I see in the TRACE:

(12.1) 05-25-11 12:50:45 (1812:3368) DATAFLOW: Process to execute data flow <DF_DPR_PART> is started.

(12.1) 05-25-11 12:50:45 (1812:3368) DATAFLOW: Data flow <DF_DPR_PART> is started.

(12.1) 05-25-11 12:50:45 (1812:3368) ABAP: ABAP flow <R3_ZBI_RE_DPR_PART_BODS> is started.

(12.1) 05-25-11 12:50:45 (1812:3368) ABAP: Begin executing ABAP program <ZBI_RE_DPR_PART_BODS>.

(12.1) 05-25-11 12:50:51 (1812:3368) ABAP: End executing ABAP program <ZBI_RE_DPR_PART_BODS>.

(12.1) 05-25-11 12:50:51 (1812:3368) ABAP: ABAP flow <R3_ZBI_RE_DPR_PART_BODS> is completed.

(12.1) 05-25-11 12:50:51 (1812:3368) DATAFLOW: Cache statistics determined that data flow <DF_DPR_PART> uses <0> caches with a total size of <0> bytes. This is less than (or equal to) the virtual memory <1606418432> bytes available for caches. Statistics is switching the cache type to IN MEMORY.

(12.1) 05-25-11 12:50:51 (1812:3368) DATAFLOW: Data flow <DF_DPR_PART> using IN MEMORY Cache.

(12.1) 05-25-11 12:50:55 (1812:3368) DATAFLOW: Data flow <DF_DPR_PART> is completed successfully.

========================================================================================
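As a stopgap until a built-in duration column is available, the per-DF durations can be computed from the trace text itself. The sketch below is only an illustration, not a BODS feature: it assumes the trace lines follow exactly the "(version) MM-DD-YY HH:MM:SS (pid:tid) DATAFLOW: ..." layout shown above, and the function name `df_durations` is my own.

```python
import re
from datetime import datetime

# Matches the DATAFLOW start/completion lines in the trace excerpt above.
# Assumes the "(12.1) 05-25-11 12:50:45 (1812:3368)" prefix format.
LINE_RE = re.compile(
    r"\(\S+\)\s+(\d{2}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+\(\d+:\d+\)\s+"
    r"DATAFLOW: Data flow <(\w+)> is (started|completed successfully)\."
)

def df_durations(trace_lines):
    """Return {dataflow_name: duration_in_seconds} parsed from trace lines."""
    starts, durations = {}, {}
    for line in trace_lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), "%m-%d-%y %H:%M:%S")
        name, event = m.group(2), m.group(3)
        if event == "started":
            starts[name] = ts
        elif name in starts:
            durations[name] = (ts - starts.pop(name)).total_seconds()
    return durations
```

Run against the excerpt above, this would report DF_DPR_PART at 10 seconds (12:50:45 to 12:50:55); the same script can be pointed at the full trace file of a long-running job.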

There is no problem as long as a DF runs within a few minutes.

Here the data flow says it is started, and a few seconds later says it is completed successfully. Wonderful. Right after that, I would like the duration taken to be displayed.

How do I do that, so that I don't have to keep a manual matrix for each flow?

Because when certain jobs run for 2, 4, or 8 hours at the SAP R/3 level, that is when we find it difficult, and tuning of the whole WF/DF is called for.

We have checked and sorted out the problem, but direct visibility of the time taken by each block is what I wish to incorporate.

=================

Any advice, please.

Thanks

indu

Edited by: Indumathy Narayanan on Jun 15, 2011 9:36 AM

Former Member replied

Hi,

Not sure what version of BODS you're using, but my version is 12.1, and the Administrator app on the Management Console provides a very useful 'Performance Monitor' for each job execution. It's a link next to each job status on the Batch Job Status view under 'Job Information'. You can view it as a graph or as detail, and it shows each DF's runtime.
