on 01-16-2013 10:30 PM
Hi,
We are looking to measure the performance of process chains in DP. The evaluation parameters are the time taken to execute and the amount of data processed during the run. How can we do this?
Also, is it possible to evaluate the performance of each job that is part of the process chains separately? If yes, then how?
regards,
Mohit
Mohit,
Process chains have logs. Check the logs to see the time taken for each variant, and check the volume of data used in each variant. Tabulate this for a month and see whether you get predictable run times. Use parallelization in all DP background jobs. You also need to pay attention to the available work processes when the process chain runs. You can save logs for as long as you wish, but this again costs performance; in particular, DB logs getting full is not a good sign.
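As a minimal sketch of the tabulation step, assuming you have exported per-variant run times from the process chain logs into simple records (the variant names and numbers below are hypothetical, not from a real system):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records exported from process chain logs:
# (variant name, run date, duration in seconds)
log_records = [
    ("DP_MACRO_FORECAST", "2013-01-01", 1850),
    ("DP_MACRO_FORECAST", "2013-01-02", 1920),
    ("DP_LOAD_CVC",       "2013-01-01", 340),
    ("DP_LOAD_CVC",       "2013-01-02", 355),
]

# Group durations by variant
durations = defaultdict(list)
for variant, run_date, seconds in log_records:
    durations[variant].append(seconds)

# Tabulate average and worst-case run time per variant over the period
for variant, secs in sorted(durations.items()):
    print(f"{variant}: avg {mean(secs):.0f}s, max {max(secs)}s over {len(secs)} runs")
```

Over a month of data, a table like this makes it easy to spot which variants dominate the total run time and whether they are trending upward.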
In general, most of the time taken in process chains goes into running macros in the background. This is especially true if the level of aggregation used for the macros is granular, the time horizon is "Total", and the macro is set to "details all" mode. You can't do much to improve this at the functional level, apart from parallelization.
If your objective is to make things fast, you need a strategy to break the process chains into sensible chunks and run them at sensible times, depending on what planners expect in interactive planning and how often. For example, you may not need to run some macros every day, and the statistical forecast may need to run only once a month or once every 15 days. This way performance is a pain only some of the time, not all the time.
Hope this helps in some way.
Sorry Mohit,
I missed that you asked a question:
"How and where can I see the volume of data generated by each variant?"
Right-click on the process chain job and display the variant. For example, if it is a macro, the respective job shows the number of planning objects; click on that number to see them. Then go to the data view and find out how many iterations the macro has. This is a rough way to gauge the data volume processed by each variant. Similarly, for other jobs, e.g. create CVCs, you can check the number of characteristic combinations the job generates.
Another way is to check the spool.
Another way is to check the logs, if logging is enabled in the job.
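The rough volume estimate described above (planning objects times macro iterations) boils down to simple arithmetic; a sketch with illustrative numbers (not from a real system):

```python
# Rough data volume per macro variant: planning objects x macro iterations.
# Both figures below are hypothetical examples.
planning_objects = 12_000   # shown in the macro's background job details
macro_iterations = 3        # steps per object, seen in the data view

# Approximate number of cell operations the variant performs per run
rows_processed = planning_objects * macro_iterations
print(f"Approx. rows touched by this variant: {rows_processed}")
```

Comparing this figure across variants tells you where the bulk of the data processing happens, even without exact log numbers.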
If the volumes per job are large and the jobs take a long time, you can break them down and run them in parallel. This would mean re-engineering the process chains.
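The idea of breaking a large job into parallel chunks can be sketched as follows; the CVC names and chunking helper are hypothetical, purely to illustrate the split:

```python
# Hypothetical list of characteristic combinations (CVCs) a big job covers
cvcs = [f"CVC_{i:05d}" for i in range(1, 101)]

def split_into_chunks(items, n_jobs):
    """Split items into n_jobs roughly equal chunks, one per background job."""
    chunk_size = -(-len(items) // n_jobs)  # ceiling division
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

# Four chunks that could each be assigned to its own variant/background job
jobs = split_into_chunks(cvcs, 4)
for i, job in enumerate(jobs, start=1):
    print(f"Job {i}: {len(job)} combinations ({job[0]} .. {job[-1]})")
```

The split itself is trivial; the real work is defining one variant per chunk and making sure enough background work processes are free when the chunks run side by side.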
Hope this helps you visualize and connect to your ultimate objectives.
Let me know if you have any other questions.