
SAP MII & ESP Integration Guide (Technical)

Overview

I am often asked how the SAP Event Stream Processing (ESP) engine and the SAP Manufacturing Integration & Intelligence (MII) application can be used together to provide value for various customers and their different use cases.  I wrote a blog on this topic from a business perspective here (http://scn.sap.com/community/manufacturing/mii/blog/2014/09/02/streaming-data-in-the-sap-industrial-space-miipcohanaesp) that highlights how to leverage the integration between these products to drive maximum value and coverage of the various use cases.  As you probably know, there are many deployment options for both the SAP MII and ESP products, and more recently deployment on a central HANA database has become an option for both.  Each product has something that it is really good at; the integration below highlights the strengths and weaknesses of each product and shows how they complement one another in a joint landscape.

What is SAP Event Stream Processor (ESP)

In short, ESP is really good at inline data analytics: as data arrives on the stream, its waveform (typically a sliding window) is instantly correlated with various other waveforms to identify patterns in the data.  It can also stream this data, in tremendously large volumes, to HANA database tables for longer-term predictive and operational analytics.  ESP has its origins in the financial sector, in high-frequency trading and trading analytics, but we believe this engine also has value for driving inline process analytics and controls visibility for operations people, enabling live/real-time process improvements (see the blog at http://scn.sap.com/community/manufacturing/mii/blog/2014/09/02/streaming-data-in-the-sap-industrial-space-miipcohanaesp).
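To make the sliding-window idea concrete, here is a minimal CCL sketch; the stream and column names are hypothetical (not from any shipped model), and it simply keeps one minute of tag readings and continuously recomputes a per-tag average as new values arrive:

CREATE INPUT STREAM TagIn
SCHEMA ( TagName string, TagValue float, ReadTime bigdatetime );

/** Unnamed one-minute sliding window with a continuously updated per-tag average */
CREATE OUTPUT WINDOW TagAvg1Min
PRIMARY KEY DEDUCED
AS SELECT TagIn.TagName, avg ( TagIn.TagValue ) AS AvgValue
FROM TagIn KEEP 1 MINUTE
GROUP BY TagIn.TagName;

A further derived stream could then compare the raw feed against this rolling average to flag excursions the moment they happen, which is the kind of inline correlation described above.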

What is SAP Manufacturing Integration & Intelligence (MII)

The SAP MII product is very good at providing local operations KPI views with SAP ERP context around the data, and it is bundled with SAP Plant Connectivity (PCo), which interfaces directly to the various Operations Technology (OT) systems.  Combined, they give you a very powerful ERP-centric view of your live and historical operations data for driving live performance management and KPI data to improve processes in real time.  This view of the data can be pushed to ERP to drive live insight into what is happening at various plants and locations, or to local operator dashboards and screens to further drive operations visibility and capture additional context around events from operators.  This data can then be loaded into the local MES, the central ERP/HANA instance, or both, in a manner that is seamless to the systems and workers, ensuring consistency of data and information.

Architecture

The SAP MII, PCo, ESP, and HANA products all have technical features that can be jointly leveraged to drive local and central OT data with ERP context, so that the data can be correlated inline and also after the fact, once it is stored in HANA database tables.  Having context around event points also enables ad-hoc analysis of detected events to drive operations improvements across multiple systems simultaneously.  The most common architectural scenario for integrating MII/PCo with ESP is to stream live data from PCo directly to ESP in high volumes, and in parallel to include lower-volume data (such as data from an operator or an MES/ERP system) in the stream alongside the OT data.  Then, using the context that MII/PCo provides to feed the "Enriched Data Stream" of ESP, you have the ability to correlate the data across streaming inputs as it happens.  Conceptually the interface looks like this:

The live operations data needs to be correlated with live execution and enterprise information so that it carries the proper identification fields; it also needs the additional conditions the assets are operating under in order to determine the proper course of action, if any.  Such an event can then be raised and pushed to MII for reporting or process-improvement recommendations, or to HANA for predictive analytics and multi-asset/distributed-asset comparisons.  This is all done agnostically of the underlying historian/SCADA/DCS/HMI vendor because it uses the SAP MII and PCo products.  On a more technical level, the following diagram shows the details behind the various interfaces between the different SAP products and how they are used:

There are many ways to leverage the above interfaces to drive process improvements and visibility into live operations data, either inline with the acquisition or after the fact in a larger analytical HANA model spanning huge time frames and many assets.  Again, this is all possible because of improvements to the SAP MII product for mapping ERP data to the various tag sensor data points using the Plant Information Catalog (PIC).  It will continue to improve with the "Remote Administration & Configuration of PCo" topic, currently in the customer co-innovation process, which is planned to support pushing metadata from the PIC to the notification streams of the PCo engine, specifically to address the BuCM requirement shown in the above diagram.

Something Important to Keep in Mind

When people ask me about this scenario I am always very pleased to hear how much thought they have given it and how eager they are to implement it to bolster their existing analytics capabilities.  I am also confronted with a common question about the reliability of the data and how accurate the streaming engine is for loading data into HANA.  We are aware that in the SCADA, DCS, or tiered historian layers, data sometimes gets buffered, so the "Current" values do not update until the buffered data moves through the network; only then, if you are hooked to a historian, will the most recent ("Current") data value be updated, and that is what appears in the stream.  So the streaming feed behaves more like a video feed than a guaranteed-delivery/replication mechanism for the historian data.  The ESP engine has a built-in way to detect the flat-lining of tag data, which in ESP terms is called "Data Aging".  This is a common scenario in the financial sector as well, which is why ESP already handles detecting it.  From there, ESP can send a "Data Retrieval" style message to MII, which can then synchronously query the historian's event table for all of the events missed during the detected flat-lining (data aging) period and load the missing data into the HANA tables.  The scenario looks like the below, in case you are a more visual person:

The above scenario is a very common one to come across in the real world, especially when working with a large number of distributed assets.  It is important to know that this scenario only needs to be configured for your particular business rules; it is not a custom development project.
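To give a flavor of how the detection side can be expressed, here is a minimal CCL sketch of a data-aging policy.  All names are placeholders, the 60-second interval would come from your business rules, and the exact aging-policy syntax should be checked against the CCL reference for your ESP version:

/** Keep the latest value per tag; flag the row if no update arrives for 60 seconds */
CREATE INPUT WINDOW TagCurrent
SCHEMA ( TagName string, TagValue float, AgeFlag integer )
PRIMARY KEY ( TagName )
AGES EVERY 60 SECONDS SET AgeFlag;

/** Aged rows become candidate "Data Retrieval" requests for MII to back-fill from the historian */
CREATE OUTPUT WINDOW Retrieval2MII
PRIMARY KEY DEDUCED
AS SELECT TagCurrent.TagName, TagCurrent.AgeFlag
FROM TagCurrent
WHERE TagCurrent.AgeFlag > 0;

An HTTP output adapter attached to Retrieval2MII (in the same style as the Post to MII adapter shown later) would then carry the retrieval request to MII.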

Configuration of SAP ESP

In order to set up and use the above scenario, a stream interface first has to be defined in the SAP ESP product to accept the high-volume data from SAP PCo, along with another, lower-volume interface so that MII can push MES/ERP/operator data into the streams.  For now we will stick to the PCo-to-ESP interface, as the MII-to-ESP interface is already covered here (SAP MII & Sybase ESP Publish Actions | SCN).  To define a stream in the SAP ESP environment, first open the ESP Studio (additional details and help are available here: SyBooks Online), connect/log in to your ESP server, and create a new project and workspace.  Using the SAP Sybase ESP Authoring perspective you can create very complex and interactive streaming and analytical flows like this:

The above flow enables the ESP engine to sort the various streaming data feeds across multiple PCo notification streams, correlate the enriched streams together, and then generate a severity and a notification payload for MII as the data streams in from the tags (100 ms per data point per tag).  The notifications are fed to the MII Messaging Services interface (HTTP) and stored in the MII KPI framework so that event frequencies can be recorded and reported on as they happen, giving operators insight into the process.  This is the foundation for Short Interval Control (SIC).  The Post to MII operation in the above ESP stream definition simply authenticates (using Basic Authentication) and posts an XML message to the MII Message Listener (Message Listeners - Message Services - SAP Library) using this URL:

http://nvpal771.pal.sap.corp:50000/XMII/Illuminator?service=WSMessageListener&mode=WSMessageListenerServer&Session=false…

This POST is performed whenever an "event" with an assigned severity is detected by the ESP engine across the streams as the live data flows into the engine.  Here are some example snippets showing how to handle and manage the PCo data:

/** Write PCo events to a file for replay purposes -- starts manually */
ATTACH OUTPUT ADAPTER oaPCoEvents TYPE toolkit_file_csv_output TO isPCo GROUP asgNoStart
PROPERTIES
    dir = 'D:/SAP/ESP/workspace/pco1/data',
    file = 'isPcoSimulator',
    csvPrependStreamNameOpcode = TRUE,
    csvHasHeader = TRUE;

/* Simulator input stream to test input without running PCo */
/*
ATTACH INPUT ADAPTER isPCoSimulator TYPE toolkit_file_csv_input TO isPCo GROUP asgNoStart
PROPERTIES
    csvExpectStreamNameOpcode = TRUE,
    dir = 'C:/ESP5.1SP04/workspace/pco1/data',
    file = 'isPcoSimulatorSmall.csv',
    csvHasHeader = TRUE;
*/

/** Send XML alert to MII Message Services using HTTP POST */
ATTACH OUTPUT ADAPTER POST2MII TYPE toolkit_http_output TO Alerts2MIIXML
PROPERTIES
    bodyCharset = 'UTF-8',
    retryNumber = 1,
    bodyColumn = 1,
    requestUrl = miiMessagePostUrl;

/** Stream the raw tag data into the HANA table RTES.TAGDATA (insert-only) */
ATTACH OUTPUT ADAPTER Tags2HANA TYPE hana_out TO isPCo
PROPERTIES
    service = 'MIIServiceHANA',
    sourceSchema = 'RTES',
    table = 'TAGDATA',
    outputBase = FALSE,
    dataWarehouseMode = 'INSERTONLY',
    timestampColumnName = 'HANATimestamp',
    maxReconnectAttempts = 5,
    maxQueueSize = 32768;

/** Persist the detected alerts in the HANA table RTES.MIIALERTS */
ATTACH OUTPUT ADAPTER HANA_Output1 TYPE hana_out TO Alerts2MII
PROPERTIES
    service = 'MIIServiceHANA',
    sourceSchema = 'RTES',
    table = 'MIIALERTS',
    dataWarehouseMode = 'INSERTONLY',
    timestampColumnName = 'AlertTimestamp';

/** Adapters in the asgNoStart group are started manually, not with the project */
ADAPTER START GROUPS asgNoStart NOSTART;
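For completeness, the adapters above all attach to an input stream named isPCo, whose definition is not shown.  A minimal sketch of what such a definition might look like is below; the column names are assumptions (chosen to line up with the timestampColumnName property above and with the PCo notification mapping described later), not the actual schema:

/** Hypothetical input stream that the above adapters attach to */
CREATE INPUT STREAM isPCo
SCHEMA (
    TagName string,
    TagValue float,
    ReadTime bigdatetime,        // reading timestamp from the tag source
    Quality string,              // quality / secondary data from PCo
    Plant string,                // context column fed from the PCo Output tab
    WorkCenter string,           // context column fed from the PCo Output tab
    HANATimestamp bigdatetime ); // matches timestampColumnName above

Whatever schema you choose here is what the PCo destination mapping (covered in the next section) has to line up with, column for column.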

Configuration of SAP Plant Connectivity (PCo)

Once a stream like the one above is completed, or at least the inputs to the stream are defined, you can define an ESP destination in the SAP Plant Connectivity (PCo) product; this is also outlined in the help documentation here: Sybase ESP Destination - SAP Plant Connectivity - SAP Library.  This destination can then be used and re-used by any number of notifications defined per agent instance inside PCo.  Here is a screen capture of how to set this up and correlate the entries between the ESP and PCo destination screens:

Once you think you have it set up properly, you can press the "Test Connection" button; if everything is deployed and running on the ESP server, the results should look like this:

The next step is to configure which tag data will be pushed into the ESP destination stream and how frequently.  To do this, set up a PCo agent against your tag source system, define your subscription item tag for the agent, and then create a notification definition.  For the trigger logic I defaulted this to Always, so that any time the tag value changes the new value is sent to the ESP stream.  Then, in the payload of the notification, which is defined on the Output tab, I defined some metadata about the stream (later this will be managed by the Remote Administration & Configuration of PCo feature), as shown below, to enhance the context of the streaming data:

Finally, the tag data, reading timestamp, quality (secondary data), and the context defined on the Output tab have to be mapped to the input of the ESP stream interface.  This is done on the Destinations tab and in my example looks like this:


Once the agent is started, data will begin to flow to the ESP stream (visible in ESP Studio), and the streams will also send raw data to the specified HANA DB tables and the detected events to the SAP MII KPI engine for driving operations reports.  The next step is to verify that the streaming data is in fact making it into the right ESP stream.  To do so, open ESP Studio, connect to the project, open the stream viewer, and verify that new data is continuously appearing in the stream:

Once you have verified that the data is in fact coming into the stream, you can then verify that the ESP stream is sending the raw data to the HANA tables and the detected event data to the MII Messaging Services layer.

Configuring MII to Accept ESP Event Notifications

The SAP MII Messaging Services layer was used here for message-monitoring purposes, but you could just as easily call the MII Transaction Runner servlet (Transaction Calls Using URLs - Content Development - SAP Library) in place of the Messaging Services.  The Messaging Services layer does, however, have a nice monitoring and management layer, which helps with demonstrating the flow of the live notification data from ESP.  First you need to set up a message processing rule to handle the XML payload and define how the messages will be processed, but this is easy to do.  From the MII menu -> Messaging Services -> Message Listener, we have already imported and configured the expected XML payload from the ESP engine so that MII can automatically identify the message type and ID:

As you can see, we have mapped the message Name and ID fields based on the contents of the XML payload.  The message name is what the Message Processing Rules use to classify how the message is processed, and the ID is used to guarantee uniqueness of the processing and to assign a unique identifier to the detected event from ESP.  The associated message processing rule was set up to route all messages with the name "ALL_HIGH_TAG_VALUES" (meaning there are lots of upper-threshold violations that need to be handled with additional logic), and below it a generic rule processes all other messages in the same way into the MII KPI engine, to persist and report on their frequency over the specified KPI time interval.
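For reference, on the ESP side the payload carrying these Name and ID fields is composed as a single string column (the bodyColumn referenced by the POST2MII adapter earlier).  Here is a minimal sketch, assuming a hypothetical upstream Alerts2MII stream with string columns AlertName, AlertId, and Severity; the real element names must match whatever your Message Listener mapping expects:

/** Hypothetical construction of the XML body posted to the MII message listener */
CREATE OUTPUT STREAM Alerts2MIIXML
AS SELECT
      '<Alert><Name>' + Alerts2MII.AlertName + '</Name>'
    + '<ID>' + Alerts2MII.AlertId + '</ID>'
    + '<Severity>' + Alerts2MII.Severity + '</Severity></Alert>' AS Body
FROM Alerts2MII;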

The transaction used to handle the incoming ESP messages is very simple; its only job is to take the ESP XML message and load it into the KPI object:

The KPI object definition looks like this; it is used to identify possible problems with a given process, enabling short-interval control (SIC) and other Kaizen initiatives:

Once the data is loaded into the KPI object it is possible to visualize this on a web page and slice and dice the event occurrences based on the dimensions of the KPI (interactive chart).  Here is an example of what this looks like:

This provides a great way to single out areas of concern by comparing event counts, by severity and type, against what is expected to be there.  The counts could also be split across operational data dimensions, such as a material code.
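If you prefer to compute such counts inline in ESP rather than (or in addition to) in the MII KPI engine, the same idea is a one-window aggregation in CCL.  A sketch, reusing the hypothetical Alerts2MII columns from the earlier snippet:

/** Rolling ten-minute event count by severity, computed inline in ESP */
CREATE OUTPUT WINDOW AlertCounts
PRIMARY KEY DEDUCED
AS SELECT Alerts2MII.Severity, count ( Alerts2MII.AlertId ) AS EventCount
FROM Alerts2MII KEEP 10 MINUTES
GROUP BY Alerts2MII.Severity;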

Conclusion

I hope that the above technical architecture reference diagrams and example scenario show how integrating the ESP/MII/PCo/HANA products can provide a lot of value to local and enterprise users, both for short-interval control and for long-term multi-asset analytics on asset health and operations performance.

If you have any questions or would like further details on any point raised in this document, please let me know via the comments section below.

Thanks,
Sam
