on 05-31-2011 8:17 AM
Hi All,
I want to load data from one InfoCube to another InfoCube. I have already built this in Development. In Production, if I start a Full load, will it complete, or will it time out? I have to load about 17 crore (170 million) records from the old InfoCube to the new one.
Please advise on any precautions I should take before starting the load from the old to the new InfoCube.
Thanks in Advance
Shivram
Hi
The best way to do this is to load the data selectively, in chunks of one month or a couple of months at a time.
You can use a Full DTP with filters on the month.
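To illustrate the chunking idea (this is not SAP code; the dates are invented for the example), the month-by-month filter intervals for the Full DTP runs could be planned like this:

```python
from datetime import date, timedelta

def month_ranges(start_year, start_month, n_months):
    """Yield (first_day, last_day) pairs for n_months calendar months,
    starting at start_year/start_month. Each pair would become the
    date-filter interval of one separate Full DTP run."""
    y, m = start_year, start_month
    for _ in range(n_months):
        first = date(y, m, 1)
        y2, m2 = (y + 1, 1) if m == 12 else (y, m + 1)
        last = date(y2, m2, 1) - timedelta(days=1)  # last day of this month
        yield first, last
        y, m = y2, m2

for lo, hi in month_ranges(2010, 11, 3):
    print(lo.isoformat(), "-", hi.isoformat())
# → 2010-11-01 - 2010-11-30
# → 2010-12-01 - 2010-12-31
# → 2011-01-01 - 2011-01-31
```

Each interval is disjoint from the others, so the runs together cover the full data set without loading any record twice.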
Regards
Sudeep
Hi All,
Thank you for your quick response.
I will create an export DataSource and then load from the OLD InfoCube to the NEW one. But my question is: if I load all the data as a Full load (17 crore records), it will extract everything from the OLD cube and load it as a single request into the NEW cube. I think the request may time out, because it is loading 17 crore records as a single request. I am not sure; please advise.
Thanks in Advance,
Shivaram
Hi,
Do the following:
1 - Identify in your 2nd cube a characteristic on which the data can be broadly partitioned, such as Plant or Fiscal Year.
2 - Make sure you delete the F fact table indexes before loading. You can do this through a process chain, if you are using one, or manually from the Performance tab in the cube's Manage screen.
3 - In the InfoPackage from the 1st cube to the 2nd cube, give selections based on the characteristic identified in step 1 and run the InfoPackage multiple times, once per selection.
The data should then load without hassle.
You need not load all 170 million records in one go.
Do a selective load based on a characteristic such as Document Number, Document Date, Fiscal Period, or Calendar Month, so that the selections together cover every record exactly once.
This ensures the data is loaded in small, manageable amounts.
As said above, what you can do is create a process chain:
Drop the indexes on the 2nd cube, then place multiple InfoPackages with different selections one after another in the chain, loading the data into cube 2.
Rebuild the indexes only after the loads are complete, i.e. after all 170 million records have been added to cube 2.
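A sketch of that orchestration, with placeholder functions standing in for the real BW steps (`drop_indexes`, `run_infopackage`, and `build_indexes` are invented for illustration, not SAP APIs, and the cube name and selections are made up):

```python
log = []  # records the order of operations, mimicking a process chain log

def drop_indexes(cube):
    # Stand-in for deleting the F fact table indexes before the loads
    log.append(("drop", cube))

def run_infopackage(cube, selection):
    # Stand-in for one InfoPackage run with its own selection
    log.append(("load", cube, selection))

def build_indexes(cube):
    # Stand-in for rebuilding the indexes once all loads are done
    log.append(("build", cube))

def chunked_load(target_cube, selections):
    """Same shape as the process chain described above: drop the
    indexes once, load each selection as its own request, and
    rebuild the indexes only after every chunk has finished."""
    drop_indexes(target_cube)
    for sel in selections:
        run_infopackage(target_cube, sel)
    build_indexes(target_cube)

chunked_load("ZCUBE2", ["FISCPER 2010001-2010006", "FISCPER 2010007-2010012"])
```

The point of the structure is simply that the index drop and rebuild bracket the whole set of loads, rather than happening once per chunk.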
Alternatively, you can use a Delta DTP with the property 'Load Delta Request by Request'.
That way you need not do one huge Full load; the data will be transferred request by request, mirroring the requests in the old cube one by one.
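The request-by-request behaviour amounts to the following loop (the request IDs and function name here are invented for illustration; this only models the idea of a delta pointer advancing one source request per transfer):

```python
def next_delta_request(source_requests, delta_pointer):
    """Pick up exactly one not-yet-transferred source request and
    advance the delta pointer, so each source request becomes its
    own target request instead of one giant load."""
    pending = [r for r in source_requests if r > delta_pointer]
    if not pending:
        return None, delta_pointer  # nothing left to transfer
    nxt = min(pending)              # oldest untransferred request first
    return nxt, nxt

requests = [101, 102, 103]
pointer, loaded = 0, []
while True:
    req, pointer = next_delta_request(requests, pointer)
    if req is None:
        break
    loaded.append(req)

print(loaded)  # → [101, 102, 103]
```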