
How to reduce the time for extraction?

Former Member
0 Kudos

We are extracting data from SAP R/3 to SAP BW. The R/3 system has only 500 records, but the extraction takes five hours. How can we reduce the extraction time?

Accepted Solutions (0)

Answers (4)

Former Member
0 Kudos

Lekha

Your job has probably failed. Please check transactions SM37 (job overview) and ST22 (short dumps). Extracting 500 records should not take that long.

Please check this link on performance:

https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2...

Thanks

Sat

Former Member
0 Kudos

Hi All

We are facing the same problem as Lekha. In our case, however, most of the time is spent updating the data into the fact table. Extraction from the flat file and the pass through the transfer and update rules are very fast (a matter of seconds), but inserting the records into the F fact table takes roughly two hours for every 20,000 records (the size of one data package), i.e. fewer than three records per second.

Can you please suggest a way out? Thanks in advance.

Raj

Former Member
0 Kudos

Hi,

a) Check in transaction ST03N (expert mode) where exactly the time is spent. Make sure that statistics are switched on for WHM and OLAP (transaction RSA1 -> Ctrl+Shift+F10).

Otherwise, check the monitor details.

It is often better to drop the indexes before the update and rebuild them afterwards (a minimal sketch follows this list); you can verify the effect in transaction ST03N.

b) Check the modelling:

Run program SAP_INFOCUBE_DESIGNS in SE38.

Check whether the number of entries in the dimension tables is less than 10% of the fact table.

Check whether line-item dimensions are used properly, or whether you can introduce some. On Oracle and DB6 you can use B-tree indexes for high-cardinality dimensions.

c) Use batch processes: put your flat file into the file system (AL11) via FTP.

d) Think about compressing the F table. Check your database: is partitioning supported? If so, create partitions for your cube when maintaining it.

e) Check the load on the system during the update, i.e. what else is going on in the system while the data is being posted.

Try these first steps.
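To illustrate the index handling in point a), here is a minimal ABAP sketch. It assumes the standard BW function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR are available in your release (verify names and parameters in SE37), and 'ZSALES' is a placeholder cube name. In a productive setup you would rather use the corresponding "Delete Index" / "Generate Index" steps in a process chain.

REPORT z_drop_rebuild_indexes.

* Placeholder InfoCube name; replace with your own cube.
PARAMETERS p_cube TYPE rsinfocube DEFAULT 'ZSALES'.

START-OF-SELECTION.

* Drop the secondary indexes on the fact table before the load,
* so the database does not maintain them for every inserted row.
* Function module and parameter names as commonly documented.
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
    EXPORTING
      i_infocube = p_cube.

* ... start the InfoPackage / load here ...

* Rebuild the indexes once the load has finished.
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
    EXPORTING
      i_infocube = p_cube.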

Cheers

Sven

Former Member
0 Kudos

Hi Sven,

Let me check the options and get back to you.

Thanks as of now

Raj

Former Member
0 Kudos

Hi Sven,

On investigating, I was able to collect the following facts:

a) I already know that the maximum time is taken inserting the records into the fact table, and nowhere else.

b) One of the dimensions of the cube contains 47% as many records as the fact table. This dimension is marked for high cardinality and its index is non-unique. It was expected to hold 160,000 to 660,000 records, but it actually holds nearly 24 million. There is another dimension whose size is 16% of the fact table, and it is not marked for high cardinality.

c) The point you mentioned in (c) may not be applicable in my case, if I am not wrong.

d) The fact table is compressed: the F table contains 10% of the records, while the E table contains 90%.

e) Traffic is not a problem, as it was the same before and after this problem started.

Please share your thoughts and experience.

Regards

Raj

Former Member
0 Kudos

Hi,

Hmm, the cube seems to have been filled for a while...

Before reading on, check via SE11 whether the indexes exist on the database, and check the existence of the statistics on the Performance tab of the cube's Manage screen. If performance is poor, recalculate the statistics with a higher percentage rate (up to 20 or 25%).

I assume you are running on Oracle or DB6, otherwise the "high cardinality" flag would not make any sense; other databases create a bitmap index even if you have activated high cardinality.

As far as I can tell, there are two options from an application point of view; option a) requires emptying the cube, option b) does not.

a) Redesign the cube.

(By the way, which InfoObjects are in the problematic dimensions?)

Use line-item dimensions if possible (are any dimensions left? Then you get direct access to the table during the upload). If that is not possible, rearrange the characteristics so that you do not exceed the 10% rule (a quick check is sketched after this list).

When doing this, change the expected size of the dimension table to 3 or 4. Before that, check the current size via transaction DB02.

Once the cube has been emptied, create the partitions (provided you have an Oracle database underneath). This will speed up compression a bit as well.

Ask the Basis people to put the flat file into the file system; they will have the authorisation. Then you can upload the data in the background.

b) Create a copy of the cube...

...and merge the two in a MultiProvider. Queries can be copied via RSZC; authorisations have to be adjusted as well.

Within the copy, use line-item dimensions and everything else as described above.

Finally, create a bypass for the upload: load into the copy cube from now on and switch off the update of the old cube.
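The 10% rule mentioned under a) can be checked quickly on the database, as sketched below. This assumes the usual naming convention for custom cubes (/BIC/F<cube> for the F fact table, /BIC/D<cube><n> for the dimension tables); 'ZSALES' and the dimension suffix '1' are placeholders.

* Rough check: how large is one dimension table relative to the
* F fact table? Adjust the placeholder table names for your cube.
DATA: lv_facttab TYPE tabname VALUE '/BIC/FZSALES',
      lv_dimtab  TYPE tabname VALUE '/BIC/DZSALES1',
      lv_fact    TYPE i,
      lv_dim     TYPE i,
      lv_pct     TYPE p DECIMALS 1.

SELECT COUNT(*) FROM (lv_facttab) INTO lv_fact.
SELECT COUNT(*) FROM (lv_dimtab)  INTO lv_dim.

IF lv_fact > 0.
  lv_pct = lv_dim * 100 / lv_fact.
ENDIF.

* Rule of thumb: below 10% is fine; far above it, consider a
* line-item dimension or the high-cardinality flag.
WRITE: / 'Dimension rows as % of fact table:', lv_pct.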

Please let me know if that helped.

cheers

sven

Former Member
0 Kudos

Hi Sven

Thanks for the reply. I will get back to you on this.

Raj

Former Member
0 Kudos

Hi Sven,

We have not implemented any solution yet; we are in the process. Your idea seems quite interesting. I have proposed your thoughts along with mine and am waiting for feedback. The moment I have some information, I will share it with you. Please give me your mail ID so that I can keep you informed in case the implementation is delayed.

I am also awarding the maximum points to you.

Cheers... keep up your best efforts.

Thanks

Raj

Former Member
0 Kudos

Hi,

It's a pleasure.

So here it is: sven.muntermann@email.de

I hope we'll find a solution.

cheers

Sven

Former Member
0 Kudos

Hello Lekha,

Please refer to SAP OSS Note 417307, 'Extractor Package Size Collective Note'.
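That note deals with the extractor package size. As a hedged illustration: in the R/3 source system, the control parameters for the data transfer are normally held in table ROIDOCPRMS (maintained via SBIW). The field names below follow the commonly documented layout, so verify them in SE11 for your release; 'BWCLNT100' is a placeholder logical system name.

* Read the data-transfer control parameters for one target BW system.
DATA ls_prms TYPE roidocprms.

SELECT SINGLE * FROM roidocprms INTO ls_prms
  WHERE slogsys = 'BWCLNT100'.

IF sy-subrc = 0.
* MAXSIZE is the package size in kB; MAXLINES limits rows per packet.
  WRITE: / 'Package size (kB):', ls_prms-maxsize,
         / 'Max lines/packet: ', ls_prms-maxlines.
ENDIF.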

There is also a document, 'Data Loading Performance and Analysis':

https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2...

Hope this helps.

Thanks,

Former Member
0 Kudos

Hi,

Where is the time lost, in BW or in R/3?

Check the monitor details: when you highlight an entry in the monitor, you can quickly see when that step was performed.

If it is R/3 (which I expect from reading your post), restart the upload, log on to the source system, go to transaction SM66 and look for your extraction process.

Which table is being scanned?

And could you please tell me which extractor/DataSource is in use?

Cheers

Sven

Former Member
0 Kudos

Hi,

500 records in 5 hours? Check whether the extraction is still running, and look in ST22 for any dumps.

/manfred