
SAP Spend Performance Management 3.0 - Implementation Insights



SAP Spend Performance Management differs somewhat from other SAP procurement offerings in that it requires a good understanding of both the procurement domain and the technical side of SAP BI/BW. Furthermore, one needs to be attentive to detail during implementation to avoid issues that can otherwise consume a lot of project time.

This document contains a few insights from a recent implementation of SAP Spend Performance Management (SPM) 3.0 that I was part of. There are many well-written articles and blogs available on how to set up SAP SPM. This document intends to complement them by highlighting a few key aspects one should bear in mind during the implementation cycle to ensure a smooth delivery.

Implementation Insights

Project Preparation

I1. How is hardware sizing done?

Proper hardware sizing is important in order to avoid memory-related issues later. SAP Note 1253768 "Spend Performance Management - Hardware Sizing Guidelines" provides the details necessary to conduct this assessment using the SAP QuickSizer tool. One should encourage the customer team to carry out this activity; the implementation partner should only provide guidance on it.

Since SAP SPM is deployed on the BI/BW server, it is important to ensure proper sizing so that it does not impact the regular operations of BI/BW. It is recommended to apply a multiplication factor of 1.5 to 2.0 to the SAPS figure arrived at by SAP QuickSizer.
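As a minimal sketch of this headroom calculation (in Python; the 15,000 SAPS input is a made-up example value, not from any real sizing exercise):

```python
# Sketch: apply a headroom factor to the SAPS figure from SAP QuickSizer.
# The input value below is illustrative only.

def sized_saps(quicksizer_saps: float, factor: float = 1.5) -> float:
    """Apply a 1.5x-2.0x headroom factor so SPM loads do not
    starve regular BI/BW operations on the shared server."""
    if not 1.5 <= factor <= 2.0:
        raise ValueError("use a factor between 1.5 and 2.0")
    return quicksizer_saps * factor

print(sized_saps(15000))        # 22500.0 with the default 1.5x factor
print(sized_saps(15000, 2.0))   # 30000.0 with the conservative 2.0x factor
```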

I2. What are the software components required for SAP SPM 3.0?

It is critical to have the correct software components before starting to implement SAP SPM 3.0. One should choose from the three alternative technical combinations provided in the Release Information Note (RIN) for SAP SPM 3.0 - SAP Note 1563216. In addition, the basis team should also review the following SAP Notes:

  • 153967 - BI Content Release Strategy
  • 1672336 - Clarification of SPM Prerequisites

It should be noted that if an incorrect technical combination is selected, its effects may not be evident immediately but only during later stages of the project. Hence it is very important to select the right option at the start.

I3. Is SAP Business Objects required to implement SAP SPM?

Since the product is labeled as SAP Business Objects Spend Performance Management, many are confused about whether the SAP Business Objects (BOBJ) stack is needed to implement this solution. It is not required; the application is deployed on SAP BW.

Business Blueprint

I4. How is SAP Spend Performance Management different from other SAP BW reports?

This is an expectation that needs to be set with the customer's business team. Usually they would be using SAP BW reports and would be inclined to replace some of those reports with SAP SPM reports, which would be incorrect. It is important to highlight that SAP SPM is a strategic reporting tool, while SAP BW is used more for operational reporting. This means that in SAP SPM, we start with aggregates, identify the problem area and then drill down into the specific range to identify the root cause.

In view of this, one should bear in mind that although SAP SPM provides a 'detailed report' feature, it should not be used for a large date range, since it is resource intensive and may push the system into errors due to excess load. Secondly, it is not recommended to load data into the SAP SPM system on a daily basis. Loading data every one or two weeks is preferable, since the idea here is to observe patterns, not track individual transactions.

To illustrate this use case with an example, say a CPO wants to find out how to reduce off-contract spend. He or she can pull up the 'Off-Contract Spend by Category' report and identify the top five categories contributing to it, then add dimensions like purchasing organizations and/or departments to drill down further, identify the problem area and think of corrective actions.


I5. Are the basis tasks complex?

With SAP SPM, basis activities are relatively complex. Hence, if the project team does not have a strong basis colleague, it is important to onboard an experienced one at least for review purposes. We faced a lot of issues due to basis activities that were either missed or done incorrectly, most of which surfaced during testing of the solution - a phase where correcting these mistakes is usually costly!

One can refer to the basis setup steps in the Installation and Configuration Guide. It is very important that all steps in this guide are followed to the letter. In fact, it may be a worthwhile investment to spend an extra day reviewing these steps after the basis colleague has completed them. If some step is unclear, or one feels that a certain section of the guide is outdated, it is a good idea to seek help from the SAP Support team as well.

I6. Hints on creating transport requests to capture extractor generation in the source OLTP system

When you generate the extractors in the source OLTP system (SAP ERP in our case), the changes are captured in a transport request that can be moved to subsequent landscapes. The alternative is to generate these extractors again in the quality assurance and production landscapes, something many customers may not be comfortable with.

If we take the transport request route, it is common for some requests to run into errors while being imported into the target system due to technical object dependencies. First of all, the customer should be prepared to expect such errors; otherwise they become nervous. Secondly, below is the strategy we developed to minimize the occurrence of these errors, which one may find helpful. The following list shows the sample objects which are created during extractor generation and need to be transported.

  1. Once you generate the extractor, the following elements are created. Please note that ERP1 is the project name and BUYER is the extractor in question:
    1. Function Pool - SAPLZ_SAERP1BUYER
    2. Function Module - Z_SAERP1BUYER
    3. Extractor FM - Z_SAERP1BUYER_DS
    4. Data Source - Z_SADSERP1BUYER
    5. Extractor Structure - ZEXTR_ERP1BUYER
  2. Activate all customer namespace domains in the extractor structure and store them in transport request TR1
  3. Activate all customer namespace data elements in the extractor structure and store them in transport request TR2
  4. Activate all extractor structures and store them in transport request TR3
  5. Activate all data sources and store them in transport request TR4
  6. Store other objects (Function Pool, Function Module and Extractor FM) under transport request TR5
  7. Import transport requests TR1 to TR5 into the target system in the same sequence. If done correctly, the transport-related errors should not arise.
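The naming pattern and transport grouping above can be sketched in Python. Note that the pattern is inferred purely from the ERP1/BUYER example shown; verify the generated names in your own system:

```python
# Hypothetical helper reflecting the generated-object naming pattern shown
# above for project ERP1 and extractor BUYER; verify against your system.

def extractor_objects(project: str, extractor: str) -> dict:
    key = f"{project}{extractor}"
    return {
        "function_pool":       f"SAPLZ_SA{key}",
        "function_module":     f"Z_SA{key}",
        "extractor_fm":        f"Z_SA{key}_DS",
        "data_source":         f"Z_SADS{key}",
        "extractor_structure": f"ZEXTR_{key}",
    }

# Import order for the transport requests described in steps 2-7:
# domains (TR1), data elements (TR2), extractor structures (TR3),
# data sources (TR4), then the remaining function-group objects (TR5).
TRANSPORT_SEQUENCE = ["TR1", "TR2", "TR3", "TR4", "TR5"]

print(extractor_objects("ERP1", "BUYER")["data_source"])  # Z_SADSERP1BUYER
```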


I7. What roles should be assigned to consultant users for project implementation?

Ideally, one should ask for SAP_ALL or an equally powerful role for consultant users during implementation. Quite often, that may not be possible due to audit concerns. In that case, one can start by asking for a developer role and a BI/BW role for both SAP SPM and the source OLTP system, and then keep adding authorizations incrementally as roadblocks are hit. Unfortunately, there is no role provided by SAP for this purpose.

However, it is important to note that certain configuration steps require executing the ABAP program SSA_HELPER_PROGRAM. This should be done by a user who has SAP_ALL access. This step creates the required SPM objects in the background; if the required authorization is not available, it does not indicate an error - it simply does not create those objects. We missed this and later had to spend a lot of time identifying what went wrong!

I8. Should all extractors be generated in the source OLTP system?

It is important to note that one should not generate extractors in the source OLTP system (SAP ERP in our case) that are not required or not in project scope. There are a few extractors related to currency conversions, exchange rates, etc. (TCURR, TCURF, etc.) which may already exist in the SAP BW system and be updated at regular intervals. The SAP SPM application can use these instead of generating the extractors again.

It may also be useful to note here that although SAP SPM creates its own objects and does not conflict with the corresponding objects in SAP BW (for example, the purchase order object for SAP SPM is different from the corresponding object for SAP BW), when it comes to currency conversions and exchange rates, both SAP SPM and SAP BW use the same objects.

I9. How to plan for data loading activity?

Data loading is an important activity. One should bear in mind the following two aspects:

  1. Volume of data loaded - It is recommended to load around 5 days of data in the development environment, around 2 months in quality assurance, and whatever timeline is agreed with the customer in production. These numbers can certainly change, but it should be noted that, firstly, data loading is a time-intensive activity (more so than normal BW uploads) and, secondly, development and quality assurance servers may not have the specifications to handle large-volume SPM data loads. In our case, with an above-average production server, 3 company codes, 2 years' worth of data and round-the-clock data loading, it took us about 14 days to complete this task.
  2. Volume of production data - When you load data from the source OLTP system to the SAP SPM system in the production environment, it is important to know in advance the number of records to be pushed for specific time intervals (say 15 days or a month). This will help you plan the date ranges of the data loads better; otherwise the BW server may give in under the unexpected workload. If you are planning to push 2 years' worth of data, recording these numbers for all major data sources at 15/30-day intervals is a time-consuming job. Hence it is better to start this activity in advance, while the development or quality assurance systems are being configured. For this, you need to request access to the customer's source OLTP production system (SAP ERP in our case) in advance, and ensure that the transport requests pertaining to the generated data sources are moved to production early on, so that you can use transaction RSA3 to note the number of records. We did not plan for this early and later had to introduce a delay in the project timeline to complete this activity.
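Once the per-interval record counts have been recorded, the planning step can be sketched as follows (the counts and the 1.5 million-record threshold below are illustrative values, not from a real system):

```python
# Sketch: plan data-load batches from pre-recorded record counts per
# interval (e.g. noted via RSA3), so no single load exceeds what the
# BW server can absorb. Counts and threshold are illustrative.

def plan_batches(interval_counts, max_records_per_load):
    """Greedily merge consecutive intervals into loads that stay
    under the record threshold; an oversized interval loads alone."""
    batches, current, total = [], [], 0
    for interval, count in interval_counts:
        if current and total + count > max_records_per_load:
            batches.append(current)
            current, total = [], 0
        current.append(interval)
        total += count
    if current:
        batches.append(current)
    return batches

counts = [("2011-01", 400_000), ("2011-02", 900_000),
          ("2011-03", 300_000), ("2011-04", 1_200_000)]
print(plan_batches(counts, 1_500_000))
# [['2011-01', '2011-02'], ['2011-03', '2011-04']]
```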

I10. Can the standard roles SAP SPM provides be used as is?

SAP SPM 3.0 provides a variety of roles for different functions. We noted that the customer particularly had an issue with the end user role, since it had access to the 'Analysis Administration' link on the portal. This was not desired, as it would allow the end user to modify dimensions and measures for the application. If you face such a situation, it is recommended to reach out to SAP Support for a fix. Though they have fixed this for other customers before, to my knowledge an SAP Note for it does not exist yet.

I11. Some custom reports were created in the SPM development portal and you would like to transport them to the quality assurance landscape

This can be done in SAP BW ABAP system in two steps:

Exporting content from development -

  1. Run transaction ‘SE38’ to execute program ‘/POA/R_PST_TRS_CUST_CTN’.
  2. Partition Namespace: Change from /POA to /SPM
  3. Partition Sub Namespace: Leave Blank
  4. Click ‘Execute’
  5. Select the objects you want to transfer; selected objects will be highlighted in yellow. (In our case we transported the objects under the folder "Test Folder 1".)
  6. After Selecting the objects for transport, choose ‘Export Selected’.
  7. User will be prompted to enter a transport request.
  8. Select a transport request and save the changes
  9. Release the transport request in SE09 and import it in QA with STMS

Data sources that begin with 'SAP_' are delivered out of the box. Data sources with yellow highlights have been modified by the customer in the current system. Objects in red text have been deleted in the current system. Objects with a blue highlight are new objects created in the current system.

Importing content to quality assurance -

  1. Run transaction ‘SE38’ to execute program ‘/POA/R_PST_TRS_CUST_CTN’.
  2. Partition Namespace: Change from /POA to /SPM
  3. Partition Sub Namespace: Leave Blank
  4. You will now see the objects which have been transported.
  5. Select the objects that you transported, objects will be highlighted in yellow.
  6. To finish the import of the objects, choose ‘Import Selected’.
  7. Log off from the SPM UI, clear your browser cache and log in again.

I12. After some changes are made to SAP SPM, the reports do not show the updated content

It may help to delete the SPM cache and then restart the portal. Please consider using the function module OPM_DM_PRECALC_DELETE for this.


I13. How about the test strategy for SAP SPM implementation?

Deciding the test strategy for the implementation, and agreeing on it with the customer in the initial phases, is very important. This is because there are many complex reports in SAP SPM, and it may be difficult to compare and match their values with the source OLTP system. If the customer has undergone a BI implementation previously, their expectation will be to match all report values, which may not be possible. We presented the following test strategy to the customer and found it effective.

The test phase was categorized into three sections as follows:

  1. Testing report features - This includes verifying that all features and functions in the SAP SPM application, like 'save', 'date range', 'export', etc., work fine. This ensures usability of the application for an end user.
  2. Data validation - Since SAP SPM deals with three types of transactional data - purchase orders, contracts and invoices - validate values for these documents both in SAP SPM and in the source OLTP system (SAP ERP in our case). The idea here is to keep the data set as small as possible by adding as many dimension filters as needed. For example, we used the 'Spend by Supplier' report in SAP SPM and added a date filter to reduce it to a specific date (preferably a weekend, since there would be fewer invoices). Then we pulled up the corresponding report in SAP ERP with the same filters and matched the values to 2 decimal places. This can be done for all three kinds of transactional data on an ad hoc basis. The objective here is to verify data integrity, that is, that for a given date all documents/values have transferred from SAP ERP to SAP SPM.
  3. Testing reports - This is where expectations need to be set with the customer. While simple reports like 'Spend by Supplier' or 'Spend by Category' are easy to verify, matching more complex reports like 'Item Price Deviation by Category' may be a daunting task. Hence it should be agreed with the customer early on that in this category the project will only match values for simple reports, and the customer should have confidence that if step 2 above has passed successfully, the data shown in the complex reports should also be correct. If the customer still insists on matching values for all reports, extra time and effort should be budgeted to create parallel reports in the source OLTP system for this purpose.
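The step-2 comparison can be sketched as a trivial check (the figures below are illustrative, not from a real system): match the ERP and SPM totals for one filtered slice to 2 decimal places.

```python
# Sketch of the step-2 data-validation check: compare an aggregated value
# from the source ERP with the same slice in SPM, to 2 decimal places.
# The amounts are made-up example figures.

def values_match(erp_total: float, spm_total: float) -> bool:
    """True when both totals agree to 2 decimal places."""
    return round(erp_total, 2) == round(spm_total, 2)

# e.g. invoice spend for one supplier on a single (weekend) date
print(values_match(10452.37, 10452.374))  # True: same to 2 decimals
print(values_match(10452.37, 10452.41))   # False: mismatch to investigate
```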

In summary, whatever the case, the approach and scope of the test phase should be agreed early on with the customer to avoid surprises later.

I14. While executing a report as SAP SPM end user, I see an error "Undefined dimension: SAP_0XSAUND_20093134434847"

If this happens, check the entry in the HTTP Destination setup in the portal. In our case, the entry there was http://<host>:<port>/sap/poa/sbc/ps/spm whereas the correct entry should be http://<host>:<port>/sap/poa/sbc/ps/SPM. Writing 'SPM' in lower case instead of upper case was the mistake. This is a classic example of incorrect basis setup leading to issues later.
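As a trivial guard against this mistake, one could verify the destination path before saving it. This is a hypothetical Python check for illustration only, not part of any SAP tooling:

```python
# Hypothetical sanity check for the HTTP Destination path described above:
# the service path must end with 'SPM' in upper case.

def check_spm_destination(url: str) -> bool:
    """True only when the path ends with the upper-case segment '/SPM'."""
    return url.rstrip("/").endswith("/SPM")

print(check_spm_destination("http://host:8000/sap/poa/sbc/ps/SPM"))  # True
print(check_spm_destination("http://host:8000/sap/poa/sbc/ps/spm"))  # False
```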

I15. What kind of data validation errors should one expect?

During the data loading activity, it is common to encounter data validation errors related to date formats, etc. If one decides to suppress or ignore these errors, they can be managed from the SAP SPM portal under Data Management --> Monitor Uploads --> Upload Setup --> Validation Setup. If you need to change entries shown here that cannot be changed from the portal, you can do so in the BW ABAP system table OPMDM_VALI_RULE. However, making changes directly in the BW ABAP system is not recommended and should be done with extreme caution, only if needed.

Another common error one can expect during data loading involves the handling of special characters. These usually occur in fields like purchasing document short text, supplier address, etc. in the source OLTP system. They can either be handled via transaction RSKC in SAP BW, or special coding has to be introduced in the extractor fields in the source OLTP system. We found this a nuisance during data loading and recommend handling it early on.

I16. As SAP SPM end user, I can see reports in one system but not in the other

Usually this error occurs because the Adobe Flash version recommended for SAP SPM has been ignored. It is important to ensure that supporting software like Adobe Flash, the web browser, etc. are in line with what is recommended by SAP.


SAP Spend Performance Management is a useful reporting application whose power can be greatly enhanced if the data is enriched by third-party agencies like D&B. If properly set up and used, it is an important tool for the CPO and CFO to manage spend-related KPIs. With Ariba's Spend Visibility added to SAP's procurement portfolio, it will be interesting to see how these tools collaborate.
