
Performance Issue on 0FI_GL_14 Extraction

Hi experts,

We have a client who wants to see a report on a daily basis.

Unfortunately, the report is built on data from the standard extractor 0FI_GL_14, which reads very large SAP tables such as FAGLFLEXA.

The current BW system (SAP BW 7.01) runs a daily extraction job for this DataSource, and each load takes approximately 12 - 16 hours. The table is really huge: it holds more than 300 million records.

As a result, the daily data is not available for reporting in time, and we would like to improve the extraction performance.

We are considering two options to resolve this matter.

  1. Develop a custom DataSource (ABAP program) in SAP that extracts only the necessary data. However, since we still need to query data with posting dates up to 2 months back, the performance of this program might be even worse.
  2. Our research suggests that SAP Note 1531175 might help: it recommends creating a secondary index on the table. However, since the table is so large, the functional users are concerned about the impact of creating the index: whether it will affect their existing reports, whether it will degrade transaction performance, how much downtime the index creation will require, etc.
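For context on option 1, the selective read such a custom extractor would perform boils down to a restricted SELECT on FAGLFLEXA. Below is a minimal, hypothetical sketch (it assumes the standard FAGLFLEXA layout with BUDAT as the posting date; the variable names are illustrative, not from any real extractor):

```abap
* Hypothetical sketch of the selective read inside a custom extractor.
* Assumes the standard FAGLFLEXA layout (BUDAT = posting date).
DATA: lt_flexa TYPE STANDARD TABLE OF faglflexa,
      lv_from  TYPE budat.

* Only look back roughly two months, as the report requires.
lv_from = sy-datum - 60.

SELECT * FROM faglflexa
  INTO TABLE lt_flexa
  WHERE budat >= lv_from.
* Without a secondary index covering BUDAT (cf. SAP Note 1531175),
* this WHERE clause still forces a full scan of the 300M-row table,
* which is why option 2 (the index) matters even for a custom program.
```

This is only meant to illustrate that the two options interact: a custom DataSource alone does not avoid the full table scan unless a suitable index exists.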

Do you have any suggestions or advice regarding this? We would like to go back to the client with a recommendation for the better solution.

Thanks in advance.

PS: The attached figure shows the current configuration of the mentioned table.

Former Member