on 04-28-2016 5:08 PM
Hi,
We are working on migrating our SAP BW 7.0 environment from HP-UX to AIX. The database is Oracle on both source and target.
We are using table splitting and the parallel export/import option to reduce downtime on our 3 TB system. After several test migrations we managed to cut the export/import time down considerably, but one table, /BIC/FC_ZPP_RE, is still taking more than 40 hours.
Is there any special way of handling the /BIC tables during export/import? Any input on this would be helpful.
Regards
Krishna
Also, check the indexes on the cube (using transaction RSA1).
Any InfoCube whose table name begins with "/BIC/F" or "/BIC/E" is a custom InfoCube, so you should work with the developers directly to help identify problems with the design of those cubes.
Additionally, InfoCubes are physically implemented as multiple partitioned tables. The tables that begin with "/BIC/F" (F-fact tables) are where the initial loads occur: each new load request creates a new physical partition in the F-fact table. When an InfoCube is compressed, the contents of the F-fact table are moved into the E-fact table ("/BIC/E"). Ideally, you shouldn't have very many individually partitioned loads sitting in the F-fact table (maybe 2-4 weeks' worth at most).
Make sure your BW administrators are compressing their InfoCubes regularly. It sounds like the InfoCube you mentioned may have too many load requests that have not been compressed.
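To see whether uncompressed requests are the culprit, you can count the partitions on the F-fact tables directly from the Oracle data dictionary. This is just a sketch (it assumes you have access to the DBA_TAB_PARTITIONS view and that the table owner is your SAP schema, e.g. SAPR3 or SAPSR3); a very high partition count on /BIC/FC_ZPP_RE would point to missing compression:

```sql
-- Count physical partitions per F-fact table; each uncompressed
-- load request typically corresponds to one partition.
SELECT table_name, COUNT(*) AS partition_count
  FROM dba_tab_partitions
 WHERE table_name LIKE '/BIC/F%'
 GROUP BY table_name
 ORDER BY partition_count DESC;
```

If that table shows hundreds or thousands of partitions, compressing the cube before the export should collapse them into the E-fact table and shorten the export/import time significantly.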