Process Observer Discussion

0 Kudos

This thread is dedicated to questions and discussions around the Process Observer component that is part of the SAP Business Suite.

Process Observer is used in different scenarios, including, most prominently, with Operational Process Intelligence.

For details on what Process Observer is, see here.

Accepted Solutions (0)

Answers (24)

Former Member
0 Kudos

Hi,

I'm trying to model CRM activities in process observer. I was wondering if someone has done that before and how it was designed in terms of BO types, activities and tasks.

I'm using the direct event API POC_RAISE_EVENT because CRM is not issuing the events I need, and this works well.

I have a few questions:

1) CRM objects are quite generic, and I can have nested objects of the same type: Complaint > Follow Up Activity Complaint > ... Follow Up Activity General Task > Follow Up Activity Complaint, ... and so on; it depends on the CRM configuration. How do I map this CRM model to the Process Observer model?

2) Count KPIs seem to be quite limited, because they cannot count multiple tasks unless the tasks are part of the same activity. Is there any solution for this?

3) Is it possible to have KPIs calculated based on business object IDs rather than relying on tasks only? Since I have nested objects, I want counters per task/activity plus business object ID. Is this possible?

Many thanks

Erik

0 Kudos

Hi Erik,

I will try to help and give my input on your questions; let's see if this fits.

A general statement first: always start from your business-driven process definition. What do you want/need to see? What granularity of objects and activities is required, and for what? Only then move to the implementation layer, i.e. work out how you get the relevant events/tasks.

1) Generic objects - yes, that's common. It's a similar thing with generic events ("change" vs. "delivery blocked" or "status changed" etc.). Look at my "general statement" above: find out first what (which objects, activities) you need. There are two ways: either you are able to identify the specific object when you raise the event, or you are able to identify it when you process the event. There is a BAdI, Enhance/Split Tasks, which lets you change the task. So you could get a task GENERIC_OBJECT-UPDATE and change it to COMPLAINT-RELEASED; you can even change one task into multiple tasks. The documentation may not be 100% clear, so here is some additional advice: the BAdI presents the task (IS_TASK), and if you want to change it you put the new content - possibly multiple lines - into CT_TASK. If CT_TASK is empty, IS_TASK will be processed. If you raise the events yourself, it can help (if possible) to put all required information into the parameter data of the event so you don't need to read data here; in any case, keep performance in mind.
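
Just to illustrate the split idea, here is a minimal sketch of such a BAdI implementation. The method name and the field names used on IS_TASK / CT_TASK are placeholders (please check the actual BAdI definition via POC_CUSTOMIZING); only the IS_TASK / CT_TASK mechanism itself is as described above.

METHOD enhance_split_tasks. "placeholder name - use the interface method of the Enhance/Split Tasks BAdI
  DATA ls_task LIKE LINE OF ct_task.

  "Assumed field names (bo_type / task_type) - check the structure of IS_TASK in your system
  IF is_task-bo_type = 'GENERIC_OBJECT' AND is_task-task_type = 'UPDATE'.
    ls_task = is_task.
    "Replace the generic task by a specific one
    ls_task-bo_type   = 'COMPLAINT'.
    ls_task-task_type = 'RELEASED'.
    APPEND ls_task TO ct_task.
    "Append further lines here if one event should result in multiple tasks
  ENDIF.
  "If CT_TASK stays empty, IS_TASK is processed unchanged
ENDMETHOD.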

2) Technical reply: count KPIs can count activities or tasks. Either you fill the field Activity ID, or the fields Business Object Type and Task Type - if that is what you are looking for. Otherwise ... there are, of course, BAdIs for the KPIs. Apart from the technical solution: look at my general statement. If you need to count tasks rather than activities ... reconsider and check if you have modelled correctly. In the end tasks are technical; they are not well visible in the log. But something you count should probably be visible. Your process definition should make sense on a business level. A KPI is a business thing ... so it should probably be expressed on that level. But, of course, it is not always that simple.

3) If you have multiple business objects of the same type and you want to count for each business object, try the "Non-aggregated" option of the count KPI. The option is available with slightly newer SPs only, so if you are on a really old version it may not be there ...

Let me know if this helped.

Best regards,

Christoph

Former Member
0 Kudos

Hello Experts,

I have the following question and couldn't find a solution yet:

I would like to create a process definition that logs sales orders only for a specific business partner, to be shown in the Process Monitor.

Is that possible?

Thanks,

Marek

0 Kudos

Hi Marek,

first of all, sorry for the delay.

Now: The answer is, of course: It depends.

The simplest scenario is that your start object, i.e. the object of the first activity of your process, determines if you want to log the process. So let's say in your example the process starts with Create Sales Order (VA01). The object type is obviously the sales order (114). So you want to check if the related "business partner" is - let's say - a "special care customer" with special SLAs (that you want to monitor). If you have only this one start activity, the process will not be logged if the start activity is not logged.

To achieve this, you need to go into the "binding". The binding is what happens when the event/task is mapped to the activity. The default is that the task assigned to the activity in the process definition (POC_MODEL, view "Task Assignment") always maps to the activity. If you want to introduce a condition here, you can either create a BRFplus rule in that view ("Create Rule") or implement BAdI POC_MAIN_TASK_BIND in enhancement spot POC_MAIN_RULES.

The example below shows you some code that will suppress the binding for process type MY_TEST if the business object ID is "NO". In your example you would use the business object ID to read the sales order and determine if the customer there is a "special care customer":

METHOD if_poc_process_task_binding~task_to_activity_bind_pre_bo.

  IF cs_task_act_proc_in_bind-process_type_id = 'MY_TEST' AND   "just any sample condition
     cs_task_act_proc_in_bind-bo_id = 'NO'.
    CLEAR cs_task_act_proc_in_bind. "no binding
  ENDIF.

ENDMETHOD.

As you can see, to suppress the binding you need to clear cs_task_act_proc_in_bind. Please make sure that you don't run into performance issues here. First of all, the BAdI has a filter; you will probably want to filter on process type, in case you have more than one process running on the machine. Make sure you're not spending too much time in your code - reading data can be costly. This all depends on your data volume and what you do; be careful and consider what is happening (use common sense, which is admittedly difficult with performance).
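
To connect this to your concrete case, the condition could for example read the customer from the sales order. This is only a rough sketch, assuming that bo_id carries the sales order number and that ZSPECIAL_CUST is a hypothetical table of "special care" customers maintained by your project:

METHOD if_poc_process_task_binding~task_to_activity_bind_pre_bo.
  DATA lv_kunnr TYPE vbak-kunnr.

  IF cs_task_act_proc_in_bind-process_type_id = 'MY_TEST'. "sample process type
    "Assumption: bo_id holds the sales order number (VBELN)
    SELECT SINGLE kunnr FROM vbak INTO lv_kunnr
      WHERE vbeln = cs_task_act_proc_in_bind-bo_id.
    IF sy-subrc = 0.
      "ZSPECIAL_CUST is a hypothetical customer list; replace with your own check
      SELECT SINGLE kunnr FROM zspecial_cust INTO lv_kunnr
        WHERE kunnr = lv_kunnr.
    ENDIF.
    IF sy-subrc <> 0.
      CLEAR cs_task_act_proc_in_bind. "not a special care customer: do not bind
    ENDIF.
  ENDIF.
ENDMETHOD.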

Things get more difficult if you have multiple start activities, if you don't know from the start whether you need to log, if the data needed to decide whether to log is not easily available, etc. The example above describes a simple case, but it shows the basic approach, conditional binding, to cope with that type of requirement.

Let me know if you have additional questions and how you get along.

Best regards, Christoph

Former Member
0 Kudos

Hey Jens,

Thank you for your detailed and comprehensive answer.

Luckily I had already figured out the solution and was about to post it here. I've implemented the same method as you have written. The hardest part for me was to figure out how to keep the process from binding (the part with the CLEAR statement). It was also tough to find the right parameter to put in the IF condition.

Anyway, once again thank you for your effort. And I'm glad that the solution I came up with matches the one you posted.

Cheers, Marek

qurm
Participant
0 Kudos

Hi all,

I have recently started running a POB process in production after some months of prototyping.  While the core process works correctly, in monitoring the activity I am seeing a large number of Unassigned Tasks in POC_TASK and would like some feedback on what is reasonable.

The process is based on a Service Notification as the base object, BOR BUS2080, with Type ID 118 and a few events/tasks. Subsequent events/tasks are configured on Type 468, Maintenance Order, and Type 114, Sales Order. We have many types of Service Notification, let's say A, B and C, of which only type A is required for this process.

In the POB process definition I have configured a BRFplus rule on the binding of the Notification Create activity to the task Create Notification (21, 118). The purpose of this is to allow only notification type A to trigger the process start, by setting the value POC_BRF_ACT_BIND_DETERMINED to 'X' (true).

I am seeing various classes of Unassigned Tasks in POC_TASK:

  1. Tasks that are in the process definition, but where the process start event for that instance was in the past, so the task/event cannot be correlated to a start task/event. This is expected behaviour, as the process lifecycle is about 60 days; after that time most new tasks/events will be correlated.
  2. Tasks that are in the process definition, but should be excluded by the BRFplus rule on the start task Create Notification (21, 118). For example, notification type B should not trigger the start (as POC_BRF_ACT_BIND_DETERMINED is set to false). Should these tasks still appear in POC_TASK? I expected these would be ignored by POB.
  3. Tasks for objects like Maintenance Order and Sales Order. These objects are in the table POC_C_BOR_EVT, but I do not expect tasks to be created unless they are related to a process instance. Should these tasks still appear in POC_TASK as unassigned? I expected these would be ignored by POB.
  4. Tasks for a Maintenance Notification BUS2038, which is similar to the Service Notification BUS2080 – this is in the table POC_C_BOR_EVT, but not active, yet it still raises events/tasks.

Any feedback on the above 4 classes of Unassigned Tasks will be helpful.  I assume that POC_MASS_DELETE should be scheduled regularly to delete these unwanted tasks.

Note that the Process logging is set to "Standard Logging"

Thanks,  Andy

bernd_schmitt
Active Participant
0 Kudos

Hi Andy,

we have a BAdI implementation available that suppresses the logging of all unassigned tasks of any kind. You find it described at the bottom of this article, or in note 2018078. By activating the BAdI implementation you can suppress the orphaned tasks in your productive systems.

Logging the orphaned tasks is mainly useful for checking the logging during the design and testing phases.

Regards,

Bernd

P.S. For our internal statistics and references it would be nice if you could send me some info about your use case by email. Thank you!

bryandevaney
Member
0 Kudos

Hi Jens Christoph,

Is it possible to implement PO on a Java stack, or is the ABAP stack the only supported option?

If Java can be supported, would you have a link on how to implement it?

Best wishes,

Bryan

Jocelyn_Dart
Product and Topic Expert
0 Kudos

Hi Bryan

Suggest you post a new discussion in the BPM forum - this one  is over two years old.

Also you need to be careful with your terminology.  PO is generally used for Process Orchestration - the Java based PI+BPM+BRM combination that supersedes PI (Process Integration).

Process Observer is part of ABAP - think of it as a module - I've never seen it referred to as PO.

The information exposed by Process Observer can be used by calling applications - in fact that is core to the SAP Operational Process Intelligence (powered by SAP HANA) solution, which derives underlying process information from ABAP via Process Observer, BPM, and other sources.

Rgds,

Jocelyn

0 Kudos

Hi Bryan,

little to add to Jocelyn's answer.

Maybe: We're trying to use "POB" for Process Observer, but in your question it is quite clear.

And, right: we are very tightly integrated into the applications, so running it on Java is not an option. One thing we do see happening occasionally is that you have your Java application throw its events into a running POB installation, so you can use all the processing available in POB; there is a blog entry ("Direct API") about this - the Java system would be treated like a non-SAP system in that case. You have to be very careful with that approach concerning volume/performance; this would be for low-volume events only, see there for details.

Generally, Operational Process Intelligence is the default option when you have events from a non-ABAP-based source that you want to consolidate into a process log; it also integrates POB data, even across different sources within one process chain.

Christoph

qurm
Participant
0 Kudos

Hi guys,

Again working with POB and trying to add some new business object types to the POC_FACADE; these are Plant Maintenance objects like BUS2038 Maintenance Notification or BUS2080 Service Notification, for example.

1. I select an existing applicable SOA BO Type, and then try to add this as an entry to the "Business Object Type" view.  I was expecting that these would not need to be named as Z-objects as they are SAP standard, but I get a message about the wrong namespace.

Is it normal that I would need to create a ZBUS2080 entry, or should I be able to use BUS2080?

2. The other area that is unclear is the "Map BOR Event to Task" entry. For an SAP standard object, how do I know what events (like CHANGED or CREATED) are available? It all seems to be very trial and error. Can I use SWEC to examine the events for objects like BUS2038 and relate those to the BOR events?

It would be great to have some more custom POB examples available, as the provided standard POC examples don't help when we need to add new Facade types etc.

I am working in ECC 6.0 with EHP7.

Thanks,

Andy

bernd_schmitt
Active Participant
0 Kudos

Hi Andy,

hope this will help you:

1. When you enter a BO Type from the SOA BO Type list into the Business Object Type view you get a warning (yellow) that you can ignore; just go ahead with <return>. When you enter your own/custom objects - not yet contained in the SOA BO Types list - you should use the customer namespace to avoid conflicts. Note that we have only 5 characters here and normally do not use the BUSxxx ID here, but you can map it in POC_BOR (I think you have found this). [The BO Type is a more general concept than the BOR object type.]

2. In transactions like SWO1 you can browse the existing BOR objects with their defined events.

The availability of an event in the BOR repository, however, does not automatically mean that it is thrown at runtime (that is, when you expect it). There may be configuration or specific conditions required to activate it. Therefore it is necessary to run a BOR event trace before using BOR events and check which BOR events are actually available. You can do this using transactions SWELS and SWEL.

For missing events you can still use the direct event API, the generic BOR event or custom BOR events. You find the information here:

To find another sample for a process instrumentation, you may check this document:

Best regards,

Bernd

bernd_schmitt
Active Participant
0 Kudos

Hi Andy,

you find even more information about BO Types and the usage of BOR objects in this document, p. 36ff, p. 54ff, p. 116ff.

Regards,

Bernd

qurm
Participant
0 Kudos

Thanks Bernd,

That did help me - using SWEL is a key step to getting the right events.

One problem I had was that the way BOR objects work in ECC is not intuitive in terms of normal OO behaviour - I was expecting to refer to the ZBUS2080 object or class, as that had the custom events defined. In fact I needed to reference the BUS2080 object, and that fired the custom events in SWEL and POB.

Andy

qurm
Participant
0 Kudos

Hi,

I am just getting started with POB in a sandbox, but would like to check my understanding.  If I configure a new process today, then Event data is only available from today, yes?

I am dealing with some processes that run over several months (i.e. low volume, long timescale, like a project or work order), so if I want to look back over many months of data, I need to start capturing the data as soon as possible?

Or is there some way to go back to the start of the year, say, and generate BOR events for all the activity that has already taken place? (E.g. a work order release may have been done in Jan 2015, and only now is the work completed and TECOed.)

Thanks

Andy

bernd_schmitt
Active Participant
0 Kudos

Hi Andy,

you are in luck: we have thought about scenarios like yours. We have an event API for both BOR and non-BOR (direct) events. You can specify the event (execution) date in the interface of the API, so you can also pass events from the past. Simply run a report that evaluates the existing data (table entries) and pass the existing timestamps through the event API to Process Observer.

The following documentation is available for the event APIs (they are just simple function calls):

- via direct event API (non-BOR):

- via generic BOR event, see:

You will find that you have to fill both the BO type and the task_type (event type) in the 'internal' format, not with the BOR object names. Fill the parameter 'executed_at' with the corresponding timestamp.
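
Just to illustrate the shape of such a call, here is a minimal sketch using the direct event API POC_RAISE_EVENT mentioned elsewhere in this thread. The parameter names and the concrete type codes are assumptions based on the description above; please verify them against the function module interface and your POC_FACADE mapping.

DATA: lv_date TYPE d VALUE '20150112',
      lv_time TYPE t VALUE '093000',
      lv_ts   TYPE timestamp.

"Build the timestamp of the historic event (e.g. the release date/time of the work order)
CONVERT DATE lv_date TIME lv_time INTO TIME STAMP lv_ts TIME ZONE sy-zonlo.

CALL FUNCTION 'POC_RAISE_EVENT'
  EXPORTING
    iv_bo_type     = '468'         "maintenance order - internal BO type (assumed parameter names)
    iv_bo_id       = '000004711'   "order number
    iv_task_type   = '62'          "release - look up the internal task type in POC_FACADE
    iv_executed_at = lv_ts.        "the timestamp in the past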

Best regards,

Bernd

qurm
Participant
0 Kudos

Thanks, Bernd, this sounds like it will help us. I have not had time to try this yet, but your answer is clear.

Andy

markus_greutter
Participant
0 Kudos

Hi Experts,

we are trying to monitor a cross-system process between SAP CRM and SAP ERP. The following steps should be tracked:

- Creation of a Lead in CRM

- Creation of an Opportunity in CRM based on the Lead

- Creation of an ERP Quotation based on the Opportunity

and afterwards also the order, but we are already struggling with the quotations.

We implemented the cross-system federation as described in the in-depth workshop-document (page 138ff). But we are not able to link the quotation to the process.

When trying to map the BOR object (BUS2031 - ERP Quotation) to the activity, the problem is that this object does not exist in the CRM system, which is the process registry.

Is there a BAdI that has to be implemented for the cross-system federation? I didn't find it in the document.

Best regards,

Markus

0 Kudos

Hello Markus,

let's see: First of all, the federation will only work out of the box with certain types of integration, preferably through RFC (or RFC-based) or generally synchronous integration. Otherwise things may be difficult.

Secondly, I'm not sure I understand what you are trying to do. When looking at the workshop (and the workshop is tough stuff) I think you want to configure remote federation (slide 143). This means that you need to configure one process chain and two processes: one in CRM (Lead - Opportunity) and one (the second) in ERP (Quotation - Order). So the BOR object for the quotation is not required in CRM; you only need it in ERP. Please explain! - There are other ways to handle cross-system scenarios, primarily setting up POB in one system and sending over the events into the other, but that depends very much on volume, system set-up, what you want to achieve, and if and how you have access to the events.

Going one step further (based on assumptions that may be wrong), let me have a look at the problem of federation. POB assumes that the transaction ID of the "outgoing" activity (not necessarily the last in the process) and the transaction ID of the "incoming" activity (this must be the first activity of the second process) are the same. This happens, as said, for RFC and some other integrations. When testing, you may wish to check that: It is a bit tedious and all I can offer is SE16, but you should be able to find the task of the outgoing activity (in CRM) and incoming (in ERP) in the task log (POC_D_BA_LOG) and compare the transaction IDs. If these are equal, federation will occur (the definition must be set up correctly, i.e. both must have the same chain ID, integration points must be defined etc.).

If everything works, you should see the end-to-end process in POC_MONITOR.

Let me know if I got anything wrong.

Best regards,

Christoph

markus_greutter
Participant
0 Kudos

Hello Christoph,

thank you for your quick response!

The integration between CRM and ERP is realized with the Lean Order Interface (LORD) so it is RFC-based.

Here is a more detailed description of what we implemented:

The scenario up to now contains the following steps:

  • Create Lead ( CRM )
  • Create Opportunity ( CRM )
  • Create Quotation ( ERP )

All needed BAdIs for previous_bor and everything else are implemented and working. The steps in the CRM system are logged. The log level is set to "Standard logging".

CRM is set as Process-Registry. In both systems the RFC-destinations for federation are set.

In the CRM system we create the process definition of the scenario for the creation of lead and opportunity:

As the quotation in ERP is created after the creation of the opportunity, an integration point is assigned to activity 20CREAOP:

The Realized Process Chain is maintained:

In the ERP system we have a process definition with the same ID and the same version, and we set up there the step for the creation of the quotation:

This step has the Inbound Integration Point assigned:

The realized Process Chain is maintained with the same name:

After the processing of the scenario we got the entries in poc_monitor in CRM system

and in poc_monitor in ERP system

As you can see in the CRM system the ERP step “Create Quotation” is missing.

Best regards,

Markus

0 Kudos

Hello Markus,

if federation was working, you'd see the complete process in either system, so technically it is missing in both. That doesn't help much...

The problem can either be the set-up (Define logical systems for Federation and Set up Process Registry), the process definition (which looks good from what I see; unfortunately POC_MODEL_CHECK does not check any federation... maybe we need to look into that) or the runtime. Since runtime is the worst thing to fix, let's start there.

As described above, I think you need to check the transaction IDs. Using SE16, in table POC_D_BA_LOG try to find the task for activity 20... in CRM and note down the Transaction ID. In ERP do the same for activity 30... If the transaction IDs are different... it is a runtime issue: The event associated with Activity 20... is different from the action that does the actual RFC... for some reason. - If the IDs are the same, it is an issue with the set-up (which is much better, since this will be easier to resolve: Check the set-up, i.e. the logical systems and make sure the RFCs are working).

Best regards,

Christoph

Message was edited by: J.-Christoph Nolte (added table name)

markus_greutter
Participant
0 Kudos

Hello Christoph,

right, I forgot to mention the transaction IDs and in fact they are different. So let's focus on the runtime.


Jens-Christoph Nolte wrote:

...

If the transaction IDs are different... it is a runtime issue: The event associated with Activity 20... is different from the action that does the actual RFC... for some reason.

...

I'm not sure whether I understand correctly what to check. But let's try.

First the CRM side:

Based on the event trace we found out that the creation of the Opportunity refers to BOR-Object BUS2000111, event CREATED:

This BOR event is Mapped to the task:

And the same task is assigned to activity 20...:

On ERP side:

Do you have an idea?

Thanks a lot for your patience!

Best regards,

Markus

woutdejong
Participant
0 Kudos

Hi Christoph and Bernd,

A question on the timestamps of POB activities (execution) and the timestamps recorded in the log of the application itself (in our case FS Claims Management).

We noticed in our sandbox system that these timestamps are not equal; the timestamps in the (change) logs of the application are 1-2 seconds earlier than those in POB (in the DB tables).

This concerns us a bit in general, even though the business impact for this particular case is low.

Is this normal behavior? Can the timestamps be aligned, or is it very much BOR Event dependent?

Cheers, Wout

0 Kudos

Hi Wout,

right. POB is using the time stamp of the BOR event that is used to bring the change document information to POB from the change document creation exit. It should really be using the time stamp from the change document itself, so as not to add in the processing time of the exit (and maybe some of the change document processing; I'm not sure where that time stamp is created). I had to put a break-point into the event creation routine to see the effect, but obviously this can happen, depending on overall system load, type of change document (little/a lot of processing required) etc.

So: I will fix that. If you need the fix, I'd prefer that you raise a ticket (that may also expedite the process).

In practice it should not have much effect. Times will be off by a couple of seconds. KPIs should not depend on a couple of seconds (and I think more than two or three seconds should be very, very rare), but again: You need to judge.

But anyway, this should be fixed.

Best,

Christoph

woutdejong
Participant
0 Kudos

Christoph, ok, you are referring to the Change Document way of raising events, I guess.

But for standard BOR events it uses the timestamp as seen in SWEL. So unless the event itself contains a timestamp parameter with the actual time the event happened in the application, one has to use the SWEL timestamp.

Indeed, I agree, a few seconds off usually does not have a big impact.

0 Kudos

Hi Wout,

right (again).

Just to confirm this: the SWEL time stamp will be replaced for change document based BOR events only! Regular events will not be touched here at all! (There is no "local" time stamp that I could use anyway.) I'm using the BOR event POCGENERIC.Event_Raised to move the change document information into the BOR connector. What I will do now, technically, is to use - only for event POCGENERIC.Event_Raised - the value from the (new) parameter TIMESTAMP rather than the "SWEL timestamp", if available. So POCGENERIC.Event_Raised provides the option to pass the value, and I will use that for the change documents. No existing code will be affected by that either (unless you already fill that parameter, which is unlikely, since it doesn't exist yet).

Regards,

Christoph

Former Member
0 Kudos

Hello Jens-Christoph Nolte,

I need one clarification regarding Process Observer.

In my project we are using the direct event API to record a custom process.

We are successfully getting the logs and are able to see them in the process monitor (POC_MONITOR).

My question is: how can I send the logs to a BI system?

Please send us any step-by-step procedure you have on how to connect Process Observer to a BI system and send the data.

This would be very helpful for us.

Thanks in advance.

Regards,

Shan.

bernd_schmitt
Active Participant
0 Kudos


Hi Shan,

you find the information you are requesting in SAP note 1694446.

In the SAP IMG / Process Observer configuration (transaction POC_CUSTOMIZING) you also find the activity 'BI Content Activation - Activate Business Intelligence Objects'. Please also check the documentation of this activity.

Best regards,

Bernd

itsrahulmehta
Explorer
0 Kudos

Hi All

I was looking for some blog posts on Process Observer and found this one. Now I am pretty sure that the experts here will provide answers to my queries.

Here is what I want to understand:

1. We have a procure-to-pay process running between ERP and SRM. Both systems are separate installations; it is not one Business Suite installation.

2. I want to capture PR create and update events in Process Observer. Will PR creation or update raise BOR events as part of the standard process, or do I need to enable this in the ERP system?

3. If a BOR event is not possible, can I call the direct event API through some BAdI to log these events in Process Observer?

4. In SRM we have BOR events for most business objects. Does this mean we just need to enable Process Observer in SRM and we are good to go, or are there other settings required?

Regards

Rahul

0 Kudos

Hi Rahul,

good that you found us.

  1. This is not really a question, right?
  2. We deliver a template for procure-to-pay; it should be in your ERP system. In the process definition (TA POC_MODEL) check process definition PROCURE_TO_PAY (maybe this is not in your client, it's delivered as template content, which is available in client 000; maybe you can get that copied into your test client). - When you look at the content, you can see that the purchase order (001) is mapped to BOR object BUS2012 and the events CREATED, CHANGED and RELEASED are available and mapped to tasks (001-21, 001-88, 001-62) accordingly. The process definition also contains more activities:
    • Create Purchase Requisition (Item)
    • Post Goods Receipt (Item)
    • Cancel Goods Receipt (Item)
    • Process Invoice Verification (Item)
    • Cancel Purchase Requisition (Item)
    • Change Purchase Requisition - Purchasing Group
    • Change Purchase Requisition
    • Release Purchase Requisition (Item)
    • Create Purchase Order (Item)
    • Change Purchase Order (Item)
    • Delete Purchase Order (Item)
    • Approve Purchase Order
    • Send PO (Header)
      In my understanding these events are always raised and do not need to be activated. You need to set up Process Observer so that the events are consumed (see our blog post about setting up Process Observer). There is also a separate blog post on specifically procure-to-pay: Instrumentation for Procure-to-pay process on item level in Process Observer

  3. Yes, there is an API to directly raise events for Process Observer; it is explained in detail in the blog post Process Observer (POB) Direct Event API for Logging Processes from SAP and Non-SAP Systems.

  4. Yes, you're good to go. Set up Process Observer (activation, see above). Check if the required events are raised (that's standard BOR; use e.g. SWEL to see which events are raised). Map the events to tasks (POC_FACADE) and create a process definition (POC_MODEL). Don't forget to check your process definition (POC_MODEL_CHECK).

Let us know if you have further problems - or how this has helped you.

Best regards & good luck,

Christoph

Former Member
0 Kudos

Hi Expert,

I am trying to integrate BRFplus for a Classification KPI in Process Observer.

I am able to save the BRFplus Object ID/Rule ID. But when editing, it opens the explorer and gives the error message "Object ID 005xxxxxxxx is invalid".

Any help appreciated.

Thanks

Nitin Deshpande

Former Member
0 Kudos

Exactly the same thing here.

Do you have the error on the E-system or the K-system? The E-system works for me.

Regards,

Marc

Do you have SAP BW?

bernd_schmitt
Active Participant
0 Kudos


Hi Nitin,

an OSS ticket on component BC-SRV-BR may be helpful here.


As an alternative you may also consider using the BAdI 'Rule-Based Key Performance Indicator Calculations' (POC_MAIN_KPI_CALCULATE). Implement the interface method CLASSIFICATION_KPI_CALCULATE in ABAP.

You find the BAdI here: POC_CUSTOMIZING (transaction) -> Business Add-Ins.

Regards,

Bernd

qurm
Participant
0 Kudos

Hi Nitin and others who may have this problem,

There is a warning message that comes up after you select Create BRF Rule, suggesting you save the configuration before proceeding. So you must save the POC process model before you edit the BRF rule for the first time. Note: save in the SAP GUI, not in the BRFplus Web Dynpro.

If you save after Create and then edit the BRF rule, you will not get the "Object ... is invalid" message.

There seems to be no way to recover from this, i.e. once you have linked an activity to an invalid object, all you can do is delete that activity (or KPI etc.) and recreate it with a new BRF rule.

Regards

Andy

0 Kudos

Hi all,

yes, Andy is correct about that. You need to save in SAP GUI and there is no (reasonable) way to fix the missing save.

Best regards,

Christoph

sfaisol
Discoverer
0 Kudos

Hi Christoph,

Is it possible to provide a step-by-step guide on how Process Observer works with IDocs? Otherwise, if you can point to any existing documentation that covers how Process Observer handles non-BOR events [(external) event API], that would be nice as well.

Thanks,

shafiul

0 Kudos

Hi Shafiul,


I'm afraid I don't have a guide for IDocs available.

The best information available for working with non-BOR events should be in this blog. Some more technical information is available in the Process Observer In-Depth Workshop.

Best regards,

Christoph

sfaisol
Discoverer
0 Kudos

Thank you very much Christoph! The links above help.

Regards,

Shafiul

Former Member
0 Kudos

Hi Christoph and Bernd,

One of our clients has shown interest in using Process Observer for one of their requirements. Our client and we have some questions on Process Observer.

1. What is the SAP roadmap for this tool? Will SAP continue to support it?

2. What is the impact on performance because of this continuous logging of data?

3. The data generated in the process log will be huge. How do we manage the huge log data?

4. In the case of this client, there will be around 20,000 invoices created in an hour. With that amount of data, what will be the impact on performance, and how much log data will be generated in the process log?

5. Is there any way to integrate the data from this tool with Solution Manager?

Thanks & Regards

Radhika

Former Member
0 Kudos

Hey Radhika,

you are describing some of the questions that have arisen here as well.

Especially for question #3, we are trying to figure out a way to compact the data while backing it up - some sort of aggregation rule (everything older than 3 weeks is aggregated to a daily rate). Have you given any thought to something like that?

BR,

Marc

Former Member
0 Kudos

Hi Marc,

We have some ideas on storing the historical log data or even purging the data after a fixed period of time.

But the log might be huge even with two weeks of data, depending on the number of events created. That's why I supported question 3 above with some sample data in question 4.

We might have to plan for a periodic backup of the data.

Thanks

Radhika

bernd_schmitt
Active Participant
0 Kudos

Hi Radhika & Marc,

in order to answer some of your questions above:

1) The Process Observer tool is supported in the context of the standard support of the SAP Business Suite. IMS support for the tool is in place and will be provided with the same timelines as the other components of the suite. Beyond this we are shipping dedicated improvements, based on customer co-innovation projects, customer connect, as well as new developments like S/4HANA. We ship the innovations as part of new EhPs as well as SPs and notes for selected existing EhPs. Currently we are providing these innovations for EhP6 and EhP7.

2) The impact of one event on the DB depends on your process definitions, but also on the DB configuration and retention strategy. The worst case (4 bytes per character) may result in entries of 4 kB per event you collect with Process Observer. For more realistic scenarios you may count on 2.5-3 kB per event.

3) We are providing a deletion report/transaction POC_MASS_DELETE that allows you to remove old process instances from your DB. You can specify time intervals and process statuses to define what you want to delete/keep. You may in addition keep old process log information in a BI system or a HANA system side-by-side.

4) You can make some estimates for the data volume given the information in 2). As for the number of events: processing up to 1 million events per hour should not be a problem, given some preconditions: install the latest performance improvements and avoid performance bottlenecks like using the generic (built-in) DRB functionality when processing BOR events. These are, however, not exact performance numbers yet. We are working on publishing a sizing guide that will give more exact numbers.

5) As for integration with BPA, please check this article: http://scn.sap.com/docs/DOC-48621 . The integration was tested, but not yet delivered in the standard of SAP Solution Manager. For this requirement, please directly address Stefan Voll, who is the development lead and global owner for Business Process Improvement with BP Analytics in SAP Solution Manager.

Regards,

Bernd

Former Member
0 Kudos

Hello Christoph,

in my work with SAP PO I have come across a problem I need to fix.

I have activated SAP PO on two different systems within my company's landscape, one being restrictive and one being more open.

The more open one works without any problems. The more restrictive one does not let me access my newly created BRFplus rules within the Web Dynpros. I have set up both systems exactly the same way. Both systems let me use "Maintain BRFplus Rule for Process Status in Process Orchestration" after clicking "Display BRFplus Rule" in POC_MODEL for the built-in processes. If I create a custom process, although the ID is generated and I find it in the table view, the Web Dynpro is not able to access it.

Can you tell me how the object ID generation and access works? Are there known errors? Is there something else required to create these IDs?

Just let me know if more information is needed.

Thank you for your help on this interesting feature of SAP!

Best regards,

Marc

bernd_schmitt
Active Participant
0 Kudos

Hi Marc,

the error message may actually point to different problems. You may first check, by running transaction SU53, whether you have had authorization issues. If not, we suggest you open an OSS ticket on component BC-SRV-BR.

Regards,

Bernd

Former Member
0 Kudos

Hi Bernd,

thank you for your answer. SU53 did not show anything, but it was still a problem of R&A.

In the course of documenting processes I have come across another problem/requirement.

In a sales order process I only want SOs from a selected plant.

My idea was to assign a BRFplus rule to the "Crea SO" activity that checks whether the plant in this SO is the one I want: if true, continue; else exit.

Is this the intended way to do this? Are there any easier ways?

BR,

Marc

bernd_schmitt
Active Participant
0 Kudos

Hi Marc,

yes, you can implement the BRFplus rule of the 'Task association' in the process definition, as you describe. For the cases where you want the task not to bind, clear the event in the return structure.

As an (easier?) alternative you may also implement the BAdI 'Rule based binding of task to a process activity' (POC_MAIN_TASK_BIND). Implement the interface method in ABAP. Here again, if the task should not bind, just delete it from the change structure CS_TASK_ACT_PROC_IN_BIND.

You find the BAdI here: POC_CUSTOMIZING (transaction) -> Business Add-Ins.

All BRFplus rules in Process Observer have equivalent BAdI methods that alternatively allow an implementation in ABAP.
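
For the plant requirement, a rough sketch of such a BAdI implementation might look like the following. It reuses the interface method shown earlier in this thread and assumes that the business object ID carries the sales order number; the plant value is just a sample.

METHOD if_poc_process_task_binding~task_to_activity_bind_pre_bo.
  DATA lv_posnr TYPE vbap-posnr.

  "Assumption: bo_id holds the sales order number (VBELN)
  SELECT SINGLE posnr FROM vbap INTO lv_posnr
    WHERE vbeln = cs_task_act_proc_in_bind-bo_id
      AND werks = '1000'.                 "sample plant
  IF sy-subrc <> 0.
    CLEAR cs_task_act_proc_in_bind.       "no item in the selected plant: do not bind
  ENDIF.
ENDMETHOD.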

Hope this helps you,

Bernd

Former Member
0 Kudos

Hi Christoph,

I have a question on retrieving historical data for Process Observer. I learnt from the documentation that data will be logged in Process Observer only from the day Process Observer is made active. So, to retrieve the historical data into the process log, is there an appropriate report? Are there any standard reports to load this historical data into the process log directly? If so, can you provide the report names?

I request your suggestions on the above.

Thanks

Radhika

0 Kudos

Hi Radhika,

the problem is... it's not that simple. The event-based approach of Process Observer (POB) provides a lot of advantages, including real-time capability and a generally more intuitive approach than the extraction-based procedures, but the downside is that accessing historical data, i.e. any data that was created before you switch on Process Observer, is difficult.

Generally, bringing in old data must be considered a (small) migration project. The overall procedure consists of two parts: based on a given process definition, you first need to extract the data and then put it into the log. Let's look at the second part first, so you can understand what kind of data you need (and because it is actually the simpler part): Process Observer allows you to send events retroactively, so you can send events "from the past" (the history that you want to put into POB). Technically you can - and should - use the same interface as for the direct event instrumentation, see e.g. the blog entry Direct Event API for Logging Processes from SAP and Non-SAP Systems and the detailed information in the workshop document.

So what is left is that you need to translate your historical data into events... which is the difficult part, and something you need to do on a case-by-case basis. E.g. your process starts with the creation of a sales order: you can read the sales order, and the sales order (like most objects) provides the information on when it was created. You can now translate this to task 114-21 (Sales Order - Create) and put that into the interface (adding the ID of the sales order and any other required information).
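
As a rough sketch of that translation step, assuming sales orders as the start object and reusing the direct event API POC_RAISE_EVENT mentioned above (the parameter names of the call are assumptions and need to be verified against the API documentation):

DATA: BEGIN OF ls_order,
        vbeln TYPE vbak-vbeln,
        erdat TYPE vbak-erdat,
        erzet TYPE vbak-erzet,
      END OF ls_order,
      lt_orders LIKE TABLE OF ls_order,
      lv_ts     TYPE timestamp.

"Read the historic sales orders that should be back-filled
SELECT vbeln erdat erzet FROM vbak INTO TABLE lt_orders
  WHERE erdat >= '20150101'.

LOOP AT lt_orders INTO ls_order.
  "Translate the creation date/time into the event timestamp
  CONVERT DATE ls_order-erdat TIME ls_order-erzet
    INTO TIME STAMP lv_ts TIME ZONE sy-zonlo.

  "Send the historic event: task 114-21 (Sales Order - Create)
  CALL FUNCTION 'POC_RAISE_EVENT'
    EXPORTING
      iv_bo_type     = '114'            "assumed parameter names - check the API documentation
      iv_bo_id       = ls_order-vbeln
      iv_task_type   = '21'
      iv_executed_at = lv_ts.
ENDLOOP.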

Obviously, extracting these events retroactively is not an easy task (which is why the POB approach is event-based rather than extraction-based), and you will not be able to extract all data. But you may decide that for historical data you need fewer events, which may need to be reflected in your process definition.

Best regards,

Christoph

bernd_schmitt
Active Participant
0 Kudos

Hi Radhika,
if you send me your email address, I can send you a sample report that converts historic data into Process Observer events. The quality of the report is not 'production quality', but we were mainly using it to generate mass data based on existing business documents in the order-to-cash domain. The report reads timestamps from db tables and change documents, and then calls the direct event interface using the determined timestamps.

Checking how the report works you may be able to adapt it to your use case.

Regards,

Bernd

Former Member
0 Kudos

Thanks Bernd. I sent a separate mail to your email address.

Former Member
0 Kudos

Hi Christoph,

Our requirement is to identify all the functionalities/features implemented in SAP in the client landscape and find out how many of those are being used by the business. So, we are looking at Process Observer as a tool to identify the existing functionalities in the system. For this, we need the historical data to check whether the functionalities have been executed at least once. If a functionality/feature was used at least once, it means that it is set up in the configuration; otherwise not.

Former Member
0 Kudos

Hi Expert,

I have configured Process Observer for a sales process (SO create, SO change, OD create and PGI).

All events are triggered successfully and I can monitor them using POC_MONITOR.

I am facing the 2 issues below:

1. Even after completing all steps as per the process definition, the process status stays as "Running" instead of "Finished".

2. The count KPI is not counting as expected; the value is zero even after changing the SO multiple times.

Please help with suggestions.

Thanks

Nitin Deshpande

Former Member
0 Kudos

Hi,

I have tried with a fresh example and now the process status is "Finished"; I have not done anything differently.

But the count KPI is still zero.

Please find below the process definition and the test.

bernd_schmitt
Active Participant
0 Kudos

Hi Nitin,

happy to see that the process status is now working in your sample. Possibly the end flag was missing in your first process definition. To avoid these kinds of errors it may be useful to run the process definition checks, i.e. report (or transaction) POC_MODEL_CHECK. This is always good practice. With a later support package we have also integrated it into the Process Definition Viewer, so you can start it right from there.

When I look at the count KPI definition, I see that you have defined to count only the SO changes that are done after an outbound delivery was created. In your log sample all changes are done before the delivery was created.

To count all SO changes, you can change the KPI definition in the following way: Clear the determiner and the determiner definition and try again.

Regards,

Bernd

robphelan
Participant
0 Kudos

Hey everyone. I see from earlier posts that Process Observer relies on DRB to string documents together into an end-to-end process.

After I created my model and started recording transactions, I handed POC over to my HANA guy for report creation.

He sees the transaction records but has no idea how to link them together. Can you tell me if there is a shared GUID or something that ties all the processes together?

bernd_schmitt
Active Participant
0 Kudos

Hi Robert,

if you go to table level, you find ties between the logged tasks/events in table POC_D_PRE_BA. The relationships between the activities are just a projection out of this.

Instead of creating your own views, you may also check our pre-delivered calculation views in the SAP HANA Live package for EhP4. It comes bundled with HANA, and you find the definition of the views here: Process Observer - SAP HANA Live for EHP4 for SAP ERP - SAP Library

Regards,

Bernd

robphelan
Participant
0 Kudos

Hi,

I've created a 2nd process definition... but for some reason, my Start task is showing up twice in the Process Details in POC_MONITOR.

The Starting task can be 1) Purchase Order Creation or 2) Sales Order Creation or 3) Stock Transport Order Creation.

When I'm executing my Order Creation, I'm seeing two entries in the POC_MONITOR for each process.

Any idea what might be causing the Start Activity to show up twice with the same Order Number at the same time?

Thanks,

Robert.

bernd_schmitt
Active Participant
0 Kudos


Hi Robert, if you are using the direct API POC_RAISE_EVENT, it may be possible that the BOR event is still active and also processed, as the link between the BOR and facade event is still maintained in POC_BOR. However, whatever the reason may be, we provided a note that eliminates the duplicates during logging by comparing the transaction IDs of the events. The note is 2020327. Depending on your SP level the note may have some dependencies. Regards, Bernd

robphelan
Participant
0 Kudos

Thanks Bernd. We must be far behind because there are at least 8 notes that I'll have to apply. Plus, one of the objects a note is looking for doesn't exist in our system yet - CL_POC_METRICS_ENGINE->CALCULATE_SERIES_KPI.

Is there any way to upgrade Process Observer individually without an SP upgrade?

We're currently on 7.31 SP10 (ABAP and BASIS).

Message was edited by: Robert Phelan

bernd_schmitt
Active Participant
0 Kudos

Hi Robert, via OSS component CA-EPT-POC you may request a collective update note for your release and SP level of SAP_BS_FND. This may ease the implementation, but installation will still go through SNOTE. Regards, Bernd

bernd_schmitt
Active Participant
0 Kudos

The one change in customizing that may solve the issue, when you're using the direct event API POC_RAISE_EVENT (and not a BOR event) to create the order, is to look into the customizing setting POC_CUSTOMIZING - General Settings - Check Process Monitoring Events for BOR. Here you could uncheck the event active flag for BUS2032 CREATED, as through this setting the event may have been raised twice: by the BOR event and the direct event API. If you are using (only) the BOR event, you should however keep it active. Regards, Bernd

robphelan
Participant
0 Kudos

Hi everyone, I initially posted this in the Business Process Management area, but found this may be a better spot for my question.

This is our first Process Observer implementation. I'm using a custom BOR object to track an end-to-end process.

My activities begin with the receipt of new or updated data from an external system and end with a goods issue / receipt document being created. I manually call a final activity called "Actualized" in a clean-up method.

  1. Information Rec'd from external application
  2. STO / PO / SO (or any combination) created
  3. Possible Updates to STO / PO / SO
  4. Inbound or Outbound Delivery Document Created
  5. Goods Issue or Receipt Document Created
  6. Final Actualization Cleanup

I've assigned #1 as the Start and #5 as the End of the process.

I've been testing and the POC_MONITOR is picking up the Starting Tasks - great!

However, I have some processes that should have been put into a completed status since they've hit their ending task.

But, if you notice, it's still in "Running" status. How do I get this process to "Complete"?

Thanks,

Robert.

bernd_schmitt
Active Participant
0 Kudos

Hi Robert,

your process configuration looks fine to me, and the log shows that the final event was captured. So the process should really go to 'Finished'. I propose you open an OSS ticket to component CA-EPT-POC describing the problem as above, and also indicating the version and SP level of component SAP_BS_FND. You find this in System - Status ... - Component version.

We can then directly look into the system, and provide a fix, if needed.

Regards,

Bernd

robphelan
Participant
0 Kudos

Hello Bernd,

My boss asked me to implement an end-to-end process at a slightly higher level. He didn't want tracking done at each order level - only at the beginning and end of a major process.

So, I scrapped this model and created a new one. Transported it to QA, tested, and am now getting my status = "Finished".

I have no idea what was wrong.

bernd_schmitt
Active Participant
0 Kudos

Hi Robert,

it may have been interesting to look at the issue - we haven't seen it in our test systems - but this is even better now! Good luck with your further implementation, and let us know if you need further support.

Best Regards,

Bernd

robphelan
Participant
0 Kudos

Well, it might have been interesting for you to look at the issue, but I was pulling what is left of my hair out!

I triple-checked the config. And since it was our first Process Observer model, I was sure I had overlooked something, but I am happy the issue has resolved itself.

bernd_schmitt
Active Participant
0 Kudos

I can offer you a free haircut as compensation when you are here in Walldorf; we have a hairdresser in house.

bernd_schmitt
Active Participant
0 Kudos

Hi Robert,

maybe you have already seen the that we just started. We'd be happy to get your feedback as well!

Regards,

Bernd

robphelan
Participant
0 Kudos

just completed it a couple of days ago. Thanks!

markus_greutter
Participant
0 Kudos

Hi Experts,

is there a known issue with the versioning of the process definition?

During the proof-of-concept phase of the project, Process Observer was already configured for one small, straightforward process. Now I have created the complete setup and implemented the final process definition in a new version.

The 'Current Version' of the Process Definition Header is set to '2'. The Active flag is set for version '2'. For version '1' the flag is removed.

When I now execute the process, nothing is recorded. Starting transaction POC_MONITOR for the relevant process definition, I receive the message "No data found for the entered criteria".

Transaction POC_TASK finally shows that a process was started, but for the task which belongs to version '1' and not for the current version '2'.

Are there any further settings to be considered?

Best regards,

Markus

0 Kudos

Hi Markus,

this should work. I suspect you are trying to "continue" with an existing process. The version works like versions in SAP Business Workflow: processes that were started will continue with the version they were started with. Now you have de-activated the old version, so they are not continued. Only new processes will use version 2.

Alternatively you should check if your current process definition is okay - please run POC_MODEL_CHECK (transaction or report) on your process definition (always do that - I have just tested with the version and had an issue... that could have been identified with POC_MODEL_CHECK).

You should also check the application log (SLG1) for object POC.

If none of this helps... I'll need more information.

Best regards,

Christoph

markus_greutter
Participant
0 Kudos

Hi Christoph,

after it was clear that the versioning works fine and transaction POC_MODEL_CHECK does not show any errors, I re-checked the BAdI implementations.

Finally I found the issue in the implementation of BAdI POC_INSTR_MAP_EVT_TASK. Small but with big impact!

Thank you for your quick response. It helped me to narrow down the problem.

Best regards,

Markus

Former Member
0 Kudos

Hi Christoph,

I have a problem with Process Observer. It does not recognize the relationship between a sales order and a related delivery, although the relationship between these objects is present in the report POC_DRB_BOR_RELSHIP. I can see the delivery in POC_TASK_ORPHAN. There are no problems mentioned in POC_MODEL_CHECK. Have you got an idea where the problem could be?

SAP ECC 6.0 EHP5

SAP_BS_FND 7.02  SP09              

Thanks,
Jens

0 Kudos

Hi Jens,

this should really, really work: it's pretty much the most used demo case... so this does work. If we are talking about the SD sales order in ERP (VA01 etc.) and the related outbound delivery (BOR object LIKP, transaction VL01N)...

Since you can see the task in the orphan report, the event was raised and mapped to the correct task. And it links okay to the sales order if POC_DRB_BOR_RELSHIP shows it.

Potential problems I can think of:

  • You are somehow creating sales order and delivery at the same point in time, so when the data is processed it's not really on the database... (so you are not manually using VA01 & VL01N to create the data) - generally that should work anyway, but that could cause issues... maybe.
  • Check the outbound delivery activity and its associated task - are you sure you're seeing the task you have in the definition as an orphan? Or are you using a "similar" task?
  • Do you have some BAdI active? You can overrule the default predecessor determination using DRB... maybe you have done so (but obviously not very well - maybe you have done something for other objects and that's kicking in now with the delivery doing strange stuff).
  • Something else I can't think of... and I'd need more information.

It's probably a good idea to raise an OSS ticket; we're happy to help.

Best regards,

Christoph

Former Member
0 Kudos

Hi Christoph,

even if I create the delivery hours or days later, it doesn't work.

Yes, I am sure; if I change the process definition and mark the delivery as a start activity, Process Observer creates a new process instance after using transaction VL01N.

I have no idea about BAdIs; I have never used one before. No one except me uses Process Observer on the system I am working on.

Is it possible to use "Map Previous BOR Objects from BOR Payload" to overrule DRB in this case?

If it is possible, could you tell me how to fill this view? There are no sample entries in my system.

Best regards,

Jens

0 Kudos

Hi Jens,


something wrong is going on. This really should work. - The idea of marking the outbound delivery as a start object is quite good, but it really supports my statement that this should work.

The view to map previous BOR objects from the BOR payload (it's POC_V_MAPPREVBOR) won't help here, I think. You can find more on that in the Process Observer In-Depth Workshop on page 57. In this case - besides the fact that DRB should really work - the event LIKP-CREATED (thrown when the outbound delivery is created in transaction VL01N) has nothing (helpful) in its BOR container. When you look at the parameters of event "OutboundDelivery.created" in the Business Object Builder (SWO1) for object LIKP, you see: nothing. This event has nothing in its payload, so there is nothing to map (well, maybe there is something in the container - the information here is not always complete - but it is not helpful in this case).

Going back to what really should work: DRB. You have stated that POC_DRB_BOR_RELSHIP does show the sales order. Please really check that!

One more thing that you should check is table POC_D_PRE_BA. In POC_MONITOR, check the process ID (you need to add that to the list of visible columns in the ALV on the search page of POC_MONITOR) of the process that had the outbound delivery created when you made the creation of the outbound delivery a start activity. In SE16, check whether table POC_D_PRE_BA has an entry for that process_id. The entry of the predecessor should be there, even if the predecessor does not exist on the database (this is used for local federation). - Alternatively you can use the task ID of the outbound delivery task with POC_DISPLAY_BA. I am not sure what to expect, but it would be helpful to know if there is an entry.

Other than that... I'm pretty lost. My next step would be to put a break-point in CL_POC_BOR_EVENT=>GET_PREVIOUS_BOS_DRB where CALL FUNCTION 'DRB_NEIGHBOURS_GET' is called and see if lt_previous_bor_objects is filled as expected. I suspect that this method is never called because something strange (a BAdI implementation) is going on.

If this is all fine, I would check BOR_EVENT_HANDLER at the end, right where CALL METHOD lo_poc_ba_observer->raise_event is executed, and inspect whether lt_ba-PRE_BO_DATA is filled as expected.

If this is all well, then things get a bit more complicated.

To execute the break-points, you will need to un-schedule report POCR_MAIN_QUEUE in SM37 and execute rswfevtpoqueue in dialog. This should be done only in a test environment, since it may lead to issues with direct events that need to be processed in the proper order - events may be ignored or added to the log in the wrong order.

I think we're at a point where you should raise an OSS message - or at least where we should talk more directly, since tracking down this issue may become tedious and lengthy if we have to solve it by having you check the data, etc.


Best regards,

Christoph

kevin_wilson2
Contributor
0 Kudos

Christoph,

I have enabled POB on our ECC instance and the simulation events work great. Only took me about 2 hours to run through the end-to-end installation so that was not a problem. Now that I have data in the POB tables showing up in the monitor I have a few questions for clarification:

  1. Where does the correlation between 1 object and another related object take place?
    1. E.g. Sales order = 10001 - BOR object = BUS2032 and BOR Event = CREATED
    2. Delivery = 80001 and relates to 10001 - BOR object = LIKP and BOR Event = CREATED
    3. When POB receives LIKP for 80001 it does not receive any data telling you that it is related to BUS2032 10001
  2. What standard functionality / content is delivered to tap into the POB data and display analytics? All the demos / documents show nice graphics, but the standard software delivered with ECC gives you the raw data with no nice functionality around it. What, in addition to POB, do I need to install in order to leverage the data stored in POB for analytical processing?
  3. Does POB store the BOR Event Parameter values anywhere for processing?
    1. E.g. in BUS2032 I could create a new custom BOR event called DELIVERY_BLOCK_ON and pass the value A1 in parameter BLOCK. Is the value A1 stored anywhere?
    2. Ultimately we would also have a DELIVERY_BLOCK_OFF event with A1 as a parameter once again which should allow me to close off the duration between BLOCK_ON and BLOCK_OFF for that specific delivery block.
    3. Is this type of functionality possible?

Thank you

Kevin

kevin_wilson2
Contributor
0 Kudos

Also Christoph,

I noticed in one of the presentations that in POC_MONITOR you can click a button to view the "Instance Diagram" for a process. My view does not have that button, and I'm looking at the POC Simulation model 3 definition. Is this something that needs to be enabled with a user parameter?

Thank you

Kevin

0 Kudos

Hi Kevin,

a lot of questions. But I see that you are asking the right questions... I will give some quick answers hoping to point you in the right direction.

  1. That's our magic! Well, we're relying on some functionality called the Document Relationship Browser (DRB), which was originally implemented to support archiving. Some business objects, particularly in SD, "know" their neighbors. We have delivered a little tool, report POC_DRB_BOR_RELSHIP, that will show you which neighbors Process Observer (POB) will recognize. DRB is limited, but it is very convenient (as you have seen). Note that DRB is not the fastest way to find these relationships... so if you run into a performance issue, you may want to consider implementing the appropriate BAdI instead.
    Mind the format of your IDs when using our tool, there is no (alpha-)conversion in place. And some of the IDs coming out of DRB may be post-fixed with some additional "crap". POB will generally take care of that, though.
  2. Right: POB does not deliver analytics in the ERP stack. For an extended discussion of analytical use cases and how to address them see. The short version: you can't do analytics with POB alone; you either need to build your own thing, use BW (with our delivered BW content, including extraction), or - as the preferred option - use .
  3. No. The data is lost. Gone. For the obvious reason of saving some space on your probably already hard-working database. You can, in theory, extend the POB tables and use one of the BAdIs to store the data - this is an option we have considered (the additional parameters are passed through all of Process Observer at runtime), and you should be able to get this to work; a detailed technical explanation of pretty much everything you can do on that level can be found in our technical workshop.
    But from your additional questions I can see that you probably want to do something else: if you want to see the (e.g. delivery) block in your process log, you should have two activities, such as "set delivery block" and "remove delivery block". Your process definition should reflect your business requirement. Do not add this information "somewhere", since it will be difficult to access and use in further analytics. To get separate events, there are different approaches: you can raise your own custom BOR events from the application (as you suggested - see the sketch below), you can do it in one of the BAdIs, and for the sales order you also have change documents. You can have change documents create separate BOR events (see our technical workshop; hint: transaction SWEC), or you can use our more-or-less brand-new (a year old, so if you have a recent SP...) functionality to generate POB events directly from change documents (IMG: Process Observer -> General Settings -> Facade Layer Content -> Set Up Change Document); this latter function absolutely deserves a blog entry featuring it a bit more, since we see this requirement quite often.
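
If you do end up raising a custom BOR event like your hypothetical DELIVERY_BLOCK_ON yourself (e.g. from a user exit), a minimal sketch using the standard workflow API could look like the following - the event name, the BLOCK parameter and the value A1 are your examples, not delivered content, and you would of course derive the object key from the real document:

  DATA: lt_cont   TYPE STANDARD TABLE OF swr_cont,
        ls_cont   TYPE swr_cont,
        lv_objkey TYPE swo_typeid,
        lv_rc     LIKE sy-subrc.

  lv_objkey = '0000010001'.                  " sales order number incl. leading zeros

  ls_cont-element = 'BLOCK'.                 " container element of the custom event
  ls_cont-value   = 'A1'.
  APPEND ls_cont TO lt_cont.

  CALL FUNCTION 'SAP_WAPI_CREATE_EVENT'
    EXPORTING
      object_type     = 'BUS2032'
      object_key      = lv_objkey
      event           = 'DELIVERY_BLOCK_ON'  " custom event - must be defined in SWO1
      commit_work     = 'X'
    IMPORTING
      return_code     = lv_rc
    TABLES
      input_container = lt_cont.

The event will only show up in the process log once it is mapped to a task in the facade layer configuration, of course.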

All of the above links can also be found on our home page.

I hope this helps a bit. You may want to come back asking for more detail, my answers can only be a starting point.

Best regards,

Christoph

0 Kudos

Hi Kevin,

with regards to the Instance Diagram: no, it is not a user parameter. It comes with a more recent support package, I'm afraid. If you don't see the button, it is not there. And I'm afraid we have no note for that, since it involves a lot of UI changes (above and beyond the button) that would make it quite complex to apply.

The button is there for process instances and for process definitions, so in POC_MONITOR and POC_VIEWER. We also have a side panel that can be used in Business Client with any object - so if you have a sales order that is in some process log, the log is shown next to the sales order (e.g. VA02, VA03). There are also side panels showing KPIs for the related process.

And while I cannot promise anything, we are thinking very hard about improving this visualization, since we have had some feedback on it. There are currently limitations on visualizing federated processes, and more generally on what data you are shown, that I would like to overcome. When we release something, I will blog about it.

Best regards,

Christoph

kevin_wilson2
Contributor
0 Kudos

Thanks Christoph,

The report POC_DRB_BOR_RELSHIP is a little confusing. If I put in BUS2032 as the BO type, I would expect to see relationships with quotations and deliveries, for example. Instead it comes out with Sales Order! Am I missing something here?

Thanks

Kevin

bernd_schmitt
Active Participant
0 Kudos

Hi Kevin,

please enter BUS2032 as the BO type and a BO ID that exists in your system, including leading zeros, like 0000001000. In our test system the result then comes up with IDs for the related BUS2031, LIKP and IDOC objects.
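
If you are building the ID in a program rather than typing it into the selection screen, the standard alpha conversion adds the leading zeros for you (a small sketch; VBELN_VA is just used here as an example data element for the sales order number):

  DATA lv_vbeln TYPE vbeln_va.

  " Add leading zeros so the ID matches the database format expected by the report
  CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
      input  = '1000'
    IMPORTING
      output = lv_vbeln.
  " lv_vbeln is now '0000001000'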

Regards,

Bernd

kevin_wilson2
Contributor
0 Kudos

Thanks Bernd, that did the trick. I see that it is pretty much like document flow. Where can I find documentation on exactly which objects are related to each other?

I noticed that none of the relationships were at the item level - e.g. VBAP as an object in itself, related to LIPS, the delivery line. Does this imply that Process Observer cannot monitor an order-to-cash scenario at the item level? E.g.:

  • SO 123 item 10 relates to Delivery 234 item 10
  • SO 123 item 20 relates to Delivery 235 item 10 (Separate delivery for different line)
  • SO 123 item 20 relates to Delivery 236 item 10 (2 deliveries for same line - schedule line)
  • Each of the lines and each of the deliveries could have individually different statuses
  • Can POB monitor such complex scenarios? If so what would be the technique to adopt to achieve this visibility?

Thanks

Kevin

0 Kudos

Hi Kevin,

seems we're taking turns on you... I'll try this time.

Process Observer can support these scenarios. In short, the challenge with these scenarios is instrumentation, i.e. you won't have the right events on item basis. And the aforementioned Document Relationship Browser (DRB) won't work on item level (since you archive on object level rather than item level). To handle these scenarios, you first need to decide whether you want to monitor on item level or on header level; generally this will be governed by how the system is used - do you work on item or on header level? If you find that you are working on item level, you will model and monitor the item-level events - but you have to rely either on change documents or on putting your own events into the code. Process Observer will then store not only object IDs but also item IDs if you provide the information (also for the successor). You will find that the relevant tables and interfaces have an "item" field to support this (essentially the ID is not the object ID, but object ID + item no.). A third level (e.g. schedule lines) is not supported out of the box, though.

A more extended explanation and example can be found in the blog entry Instrumentation for Procure-to-pay process on item level in Process Observer, which is about procurement, but it works in a very similar manner for the order-to-cash process.

Best regards,

Christoph

Former Member
0 Kudos

Hi Experts,

I have the following questions about Process Observer :

1. I modelled a new process, but when checking the process definition I get multiple errors:

No BOR event of class assigned to any task

I'm sure that I assigned a task with a BOR event to every activity:

2. The second question: I'm looking for a step-by-step tutorial for pushing the Process Observer log to SAP ESP and/or HANA.

Thanks in Advance,

Kind Regards

0 Kudos

Hi Moo,

thanks for your interest in Process Observer. I hope I can help you a bit here.

With regards to your first question: you are right, you probably have assigned a task to every activity - if not, there would be a different message, which you have probably seen before. "No BOR event of class assigned to any task" indicates a different situation, one that is likely to be a problem (but doesn't have to be). The problem here is in view cluster POC_BOR, where you assign a BOR event to the task. The tasks are modeling artefacts that have to be mapped to something "real". This mapping is usually done against a BOR event (or something else). If no task assigned to an activity (in your case task 114-21) has a BOR event assigned, the activity will never be logged - except if the task is raised against the direct API, the generic BOR event, or something else that you set up yourself (and would therefore be aware of). So the message indicates that something must be set up for this to work; if that is the case, you can disregard the message (it won't go away; maybe it would be a good idea to give the user the option to mark it as "not really a problem"). - I don't know which system you are doing this in, but in an ERP system 114-21 should be mapped against a BOR event, i.e. you should find a corresponding entry

in view cluster POC_BOR -> Map BOR Event to Task (provided you have the right SP, but this was delivered pretty early on). If not, let me know... it really should be there.

Your second question is certainly more difficult to answer. Let me try, though. Process Observer data is stored in simple tables, so replicating to HANA follows standard procedures, e.g. through SAP LT Replication Server (formerly often called "SLT"); general information can be found at http://scn.sap.com/community/replication-server. For the list of relevant tables, see note 1852213. For ESP it is probably easy to find a generic link; I have no specific experience with that, and I do not yet understand what you would want to do there - I would need to understand the use case better. The same is true for the information given with regard to SAP LT: the tables you need depend on your use case; the set given in the note is what you would "commonly" want (even if you don't use the tables in a HANA Live context). You will also want to replicate application tables (e.g. VBAP, if you want to play with sales-order-related processes). If you are using Process Observer with SAP HANA Live, you may want to look at the views provided in the Virtual Data Model. But the set of tables given in the note should provide a good starting point if you just want to play with this.

Let me know if you have further questions and/or want to discuss your specific use case.

Best regards,

Christoph

Former Member
0 Kudos

Hi Christoph,

Thanks for the info, very interesting.

I don't have a use case right now; I'm just trying to play with Process Observer in combination with SAP HANA/OPInt.

When I search for note 1852213, I get the error "Document not released". Could you help?

Thanks & Regards

0 Kudos

Hi Moo,

right, note 1852213 is not released, I'm sorry about that.

Anyway, the required tables have been added as attachment "Tables_SAP_HBA_ECCFND701.txt" of note 1849168 (as pointed out on the overview page). The list of tables mentioned there is:

POC_C_KPI_COUNT
POC_C_KPI_CTG
POC_C_KPI_CTGD_T
POC_C_KPI_DURA
POC_C_PRC_KPI
POC_C_PRC_KPI_T
POC_C_PRC_STAT_T
POC_C_PROCESS
POC_C_PROCESS_T
POC_C_PS_BND
POC_C_PS_T
POC_D_BA_LOG
POC_D_BA_SRCSYS
POC_D_KPI_ACT
POC_D_KPI_LOG
POC_D_PRC_BIND
POC_D_PRC_IN
POC_D_PRC_STP_IN
POC_D_PRE_BA
POC_D_TRACK_THR
POC_I_BA
POC_I_BA_T
POC_I_BA_TYPE_T
POC_I_BO_TYPE_T
POC_I_BUS_AREA_T
POC_I_CBE_T

The note is more likely to hold the correct and full list of tables, since the list may change over time. For now this list is fine; if you are looking at this much later, you may want to check the note.
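
If you just want to see which of these tables already contain data in your system before setting up the replication, a quick ad-hoc check could look like this (only a sketch - extend the internal table with the remaining names from the list above):

  DATA: lt_tables TYPE STANDARD TABLE OF tabname,
        lv_table  TYPE tabname,
        lv_count  TYPE i.

  APPEND 'POC_D_BA_LOG'   TO lt_tables.
  APPEND 'POC_D_KPI_LOG'  TO lt_tables.
  APPEND 'POC_D_PRC_BIND' TO lt_tables.
  APPEND 'POC_D_PRE_BA'   TO lt_tables.
  " ... add the remaining tables from the list above as needed

  LOOP AT lt_tables INTO lv_table.
    " Dynamic table name, so the same loop works for all entries
    SELECT COUNT( * ) FROM (lv_table) INTO lv_count.
    WRITE: / lv_table, lv_count.
  ENDLOOP.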

I'm interested to hear back from you what you did with Process Observer and HANA.

Best regards,

Christoph

keohanster
Active Contributor
0 Kudos

Hi Christoph,

Coming from an SAP Workflow environment, I am very familiar with the power of events - and raising BOR events is easy. So this looks like it will be a great project.

Question: In the Workflow world, we know that certain changes are not reflected immediately, so we sometimes have to 'refresh the org buffers' (swu_obuf). This comes in very handy in Dev, where you are often clicking on this, saving, clicking on that, saving, etc. Just resync the buffers and all your changes are enabled.

Are changes made to, say, a Process Definition immediately taken into consideration - or only acted upon the next time you run Process Monitor?

Thanks,
Sue

0 Kudos

Hi Sue,


thanks for your question. An SAP Business Workflow background surely provides a good starting point for working with Process Observer - we're using some Workflow technology, terminology and thinking. And while this is quite helpful, "deep Workflow knowledge" is certainly not a prerequisite for working with Process Observer.

To answer your question: the Process Monitor is purely a visualization tool, and while some data is cached, almost all changes to process definitions are reflected immediately (with the exception of some texts). For good measure you can restart the Process Monitor (POC_MONITOR); there are no additional buffers to be reset.

There is some buffering going on with the process definitions, though. When a process definition is saved, the active BOR events of all process definitions are determined and put into a separate database table that holds the list of "active" events, i.e. events that are relevant for Process Observer; this table, BOR Events Relevant for Process Monitoring (POC_V_BOR_EVT), can be viewed in IMG activity Check Process Monitoring Events for Bus. Object Repository. If you change your facade layer content, an update of this table may be necessary - so go into the process definition, change something, save, and change it back to trigger an update of this buffer. Saving triggers report POCR_UPDATE_BOR_EVENT, which actually carries out the update - and you could also just run this report directly (see the sketch below).
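
If you prefer not to touch a process definition just to refresh this buffer, a one-liner does the same (a sketch; assuming the report needs no selection-screen input in your release):

  " Rebuild the list of BOR events relevant for Process Observer
  SUBMIT pocr_update_bor_event AND RETURN.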


Finally, you should look into version handling when changing process definitions - significant changes to process definitions may cause unexpected results in your process logs. But I guess this should be discussed separately... I may need to think about a new blog post.

Best regards,
Christoph