
Performance Issues

Dear all,

I am currently investigating some performance issues after reading a few useful documents around the web (such as 5 Tests for Top Performance by Jason Dove), although I have not found a complete, in-depth guide on this really important matter.

At the company where I work, while I am running my reports, the users of the application that uses the same database experience a noticeable slowdown. Believe me, they are very polite people and only mention it a few times a week, but I can easily imagine their impatience and frustration (hopefully not directed at me, lol).

My reports basically consist of 6-7 subreports, so using the Performance Timing information (under Report | Performance Information) I can easily track which subreports need closer investigation. So far I have found two or three subreports whose "Read Database Records" value is above 2000 ms (which I consider my threshold).

  1. Subreport 1 had a selection criterion that could not be translated correctly into the SQL statement: two or more conditions were written as {field} in ["A","B"]. I changed them to Like (better understood by the SQL engine) and performance improved substantially (from 6,000-7,000 ms down to about 500 ms).
  2. Subreport 2 is based on a stored procedure. I believe its selection criteria are actually applied to the data already returned by the stored procedure, which is why "Read Database Records" is so high. I ask the gurus of this community whether my reasoning makes sense; I would like to be sure before making any change that could affect several reports.
  3. Subreport 3 has a selection criterion that I cannot get translated into SQL; the formula is below, followed by the query I would like the database to run instead:

{Clients.Client_Account} = {?Client Account} and
(
    (IsNull({Contract_Header.C_Type}) = TRUE) or
    (not({Contract_Header.C_Type} like ["MATERIALS","SOFTWARE SUPPORT"]))
)
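Just to show what I mean, here is a sketch of the query I would want the database itself to run (my own guess only: @ClientAccount stands in for the {?Client Account} parameter, the join column is assumed, and I am reading Crystal's like without wildcards as a plain equality test):

-- Sketch of the predicate I would like pushed down to the database
SELECT ch.*
FROM   Contract_Header ch
       JOIN Clients c ON c.Client_Account = ch.Client_Account   -- join column is my guess
WHERE  c.Client_Account = @ClientAccount
  AND  ( ch.C_Type IS NULL
         OR ch.C_Type NOT IN ('MATERIALS', 'SOFTWARE SUPPORT') )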

The last, more general question is the one that really bothers me during my test phase: I run my reports several times, checking the figures listed in the Performance Timing each time and comparing them with the previous figures. Is it possible that the latest run performed well only because the cache was "warm"? If so, how can I make my tests more realistic?

Thanks in advance

Best Regards

Luca

Former Member replied:

Hi Luca,

You cannot do much with reports that have multiple subreports other than making sure the selection criteria they contain are passed down to the database or, at best, trying to reduce the number of subreports, if possible.

I'm pretty sure you know how subreports degrade performance depending on where they're placed on the report. The worst case is a report that returns a large volume of records and has subreports placed in the details or group sections, where the subreports themselves also return large recordsets.

Getting into damage-control mode: Subreport 2's record selection needs to go. You need to find a way to move those criteria into the stored procedure's WHERE clause. As you correctly guessed, the selection is processed locally after the records are returned. Now imagine that process being repeated over and over for each section that runs this subreport.
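As a very rough sketch only (the procedure, parameter, and column names below are made up just to show the idea; substitute your real ones), the filter moves from the subreport's record selection into the procedure itself:

-- Illustrative only: parameterise the procedure and filter inside it,
-- so only the rows the subreport actually needs are sent back to Crystal
ALTER PROCEDURE dbo.usp_Subreport2Data
    @ClientAccount varchar(30)
AS
BEGIN
    SELECT ch.*
    FROM   Contract_Header ch                    -- your procedure's real tables go here
    WHERE  ch.Client_Account = @ClientAccount;   -- was: applied locally by the subreport
END

The subreport can then pass {?Client Account} into the new parameter via the subreport links, and its record selection formula can be emptied out.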

Subreport 3's selection criteria could be written as:

(
    (IsNull({Contract_Header.C_Type})) or
    (not({Contract_Header.C_Type} like ["MATERIALS*","SOFTWARE SUPPORT*"]))
)
AND
{Clients.Client_Account} = {?Client Account}


I've never seen truly 'accurate' results from the 'Performance Timing' runtimes in CR; they always seem to vary from run to run.


The general rule of thumb is to:

- reduce the number of subreports

- process selection criteria on the database side

- compute arithmetic/string expressions in the SQL query instead of creating report-level formulas (see the example after this list)

- and there are a few others too
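For example (purely illustrative; only Clients and Contract_Header come from your post, the other columns and the join are made up), a report formula doing string or arithmetic work can usually be replaced with an expression in the query or a SQL Expression field, so the database does the work:

-- Illustrative only: let the database do the formula work
SELECT c.Client_Account,
       UPPER(c.Client_Name)        AS Client_Name_Upper,   -- string logic in SQL
       ch.Quantity * ch.Unit_Price AS Line_Total           -- arithmetic in SQL
FROM   Clients c
       JOIN Contract_Header ch ON ch.Client_Account = c.Client_Account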


-Abhilash
