Table Updates in BDLS Run

Former Member

Hi All,

Looking at the BDLS run logs on our SAP systems, it looks like only a small number of tables were actually updated among the tables it checked. For example:

ECC: 90 out of 1537 tables were updated

CRM: 85 out of 1683 tables were updated

BI (for ECC): 48 out of 830 tables were updated

BI (for CRM): 86 out of 830 tables were updated

I was thinking that if we could specify the tables to be converted, the BDLS run would be much faster. But then we won't know for sure which tables to specify.

So please suggest whether there is any way to figure out which tables will be updated by the BDLS run, so that we can specify only those table names in the BDLS execution and reduce its runtime.

Regards,

Shivam Mittal

Accepted Solutions (0)

Answers (2)

Former Member

I would run it for the whole range rather than for selected tables. But you may be able to get the list from a test run. Please check the performance section of the following link.

[http://help.sap.com/saphelp_sm32/helpdata/en/33/c823dbaea911d6b29500508b6b8a93/content.htm]

Former Member

Finding the records that need to be adjusted is one of the big parts of a BDLS run. The tables involved depend on your data, so we cannot give you a list.
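If you still want to see the candidates, BDLS derives them from the ABAP Dictionary, essentially the table fields that are based on the logical-system domain. A rough sketch of such a check via direct SQL (DD03L is the dictionary field catalogue; treat a BDLS test run as the authoritative source):

-- list active table fields based on domain LOGSYS
select tabname, fieldname
  from dd03l
 where domname = 'LOGSYS'
   and as4local = 'A';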

But it is possible to speed up BDLS, especially if you do it regularly. Check this blog for details: [Execute conversion of logical system names (BDLS) in short time and in parallel - Intermediate|http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/4796] [original link is broken]

Cheers, Michael

volker_borowski2
Active Contributor

Hi,

Having indexes on the big tables helps a lot.

If you have the resources, create them in parallel with:

create index .... compute statistics online nologging parallel 12;

This way the stats are sampled during creation and there is no need for additional stats calculation.
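For illustration, a complete statement could look like this (the index name, table, and column are just examples; build the indexes on the logical-system columns of the large tables you identified in your own system):

create index "EDIDC~ZLS" on edidc (rcvprn)
  compute statistics online nologging parallel 12;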

Do SQL monitoring while BDLS runs to detect the expensive statements that need adjustment.
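To spot them on Oracle you could, for instance, look at the top statements in the shared pool while the conversion is running (a rough sketch; ST04/DBACOCKPIT shows the same data):

-- top 20 statements by elapsed time
select *
  from ( select sql_id, executions, buffer_gets, elapsed_time, sql_text
           from v$sql
          order by elapsed_time desc )
 where rownum <= 20;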

I have got one system down from 16 hours to 35 minutes.

I need 45 minutes beforehand to create all the indexes with sqlplus (most of them take 1-3 minutes with parallel 12).

So it is a gain of 14+ hours.

Running the database in noarchivelog mode during BDLS helps if you have really many records to convert, for example on BW systems or with heavy workflow usage.
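A minimal sketch for Oracle, assuming you can afford the downtime (switch back to archivelog mode and take a full backup right after the conversion):

shutdown immediate;
startup mount;
alter database noarchivelog;
alter database open;
-- run BDLS, then switch back:
shutdown immediate;
startup mount;
alter database archivelog;
alter database open;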

Volker