
Data load from InfoCube to InfoCube: 170 million records

Former Member

Hi All,

I want to load the data from one InfoCube to another. I have built the flow in Development. In Production, if I start a full load, will it complete, or will it time out? I have to load 17 crores (170 million) records from the old InfoCube to the new one.

Please advise what precautions I should take before starting the load from the old to the new InfoCube.

Thanks in Advance

Shivram

Accepted Solutions (0)

Answers (2)

Former Member

Hi

The best way to do this is to load the data selectively, in chunks of, say, one or two months at a time.

You can use a full DTP with filters on the month.
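The monthly-chunking idea can be sketched generically. This is not a BW API, just an illustration of how a date range breaks down into per-month filter intervals, each of which would become the selection for one DTP run:

```python
from datetime import date, timedelta

def month_ranges(start: date, end: date):
    """Yield (first_day, last_day) pairs covering [start, end] one month
    at a time. Each pair can serve as the date filter for one full-DTP
    (or InfoPackage) run, so the data moves in many small requests
    instead of one huge one."""
    year, month = start.year, start.month
    while (year, month) <= (end.year, end.month):
        # First day of the following month
        nxt = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        first = date(year, month, 1)
        last = nxt - timedelta(days=1)          # last day of this month
        # Clamp to the overall interval in case it starts/ends mid-month
        yield (max(first, start), min(last, end))
        year, month = nxt.year, nxt.month

# Example: twelve monthly filter ranges for calendar year 2009
filters = list(month_ranges(date(2009, 1, 1), date(2009, 12, 31)))
```

Each tuple in `filters` would map to one load request, so a failure in one month only forces a rerun of that month.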

Regards

Sudeep

Former Member

Hi Vishal & Sandeep,

I am on BW version 3.5.

Thanks & Regards,

Shivram

Former Member

In that case, you'll have to use an export DataSource for the old cube and create the required transfer rules and update rules.

Then use the InfoPackage and do a selective load, either full or delta.

Former Member

Hi Shivram

As mentioned above, you need to create transfer rules and update rules between the export DataSource of the old cube and the new cube.

Then use a full InfoPackage with selections and load the data in chunks.

Regards

Sudeep

Former Member

Hi All,

Thank you for your quick response.

I will create the export DataSource and then load from the old to the new InfoCube. But my question is: if I load the entire data set as a full load (17 crores / 170 million records), it will be extracted from the old cube and loaded as a single request into the new cube. I think the request may time out, because it is loading 170 million records as a single request. I am not sure. Please advise.

Thanks in Advance,

Shivram

Former Member

Hi,

Do the following:

1 - Identify a characteristic in your second cube by which the entire data set can be broadly divided, such as Plant or Fiscal Year.

2 - Make sure you delete the F fact table indexes before loading. You can do this either through a process chain, if you are using one, or manually from the Performance tab in the cube's Manage screen.

3 - In the InfoPackage from the first cube to the second, give selections based on the characteristic identified in step 1 and run the InfoPackage multiple times.

The data will hopefully be loaded without hassles.

Former Member

You need not load all 170 million records in one go.

Do a selective load based on, say, Document Number, Document Date, Fiscal Period, or Calendar Month: some characteristic that covers all the records.

This ensures the data is loaded in small amounts.

As mentioned above, you can create a process chain.

Drop the indexes on the second cube. Make multiple InfoPackages with different selections, placed one after the other in the process chain, loading the data into cube 2.

Then rebuild your indexes after the loads are complete, i.e. after all 170 million records have been added to cube 2.
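The drop-indexes / chunked-load / rebuild-indexes ordering described above can be sketched as a generic driver loop. The functions passed in are illustrative stubs standing in for the real BW process-chain steps, and the per-year record counts are made up so they sum to 170 million:

```python
def run_chunked_load(selections, load_chunk, drop_indexes, rebuild_indexes):
    """Drive the process-chain pattern: drop the target cube's indexes
    once, load each selection as its own request, then rebuild the
    indexes after all chunks have arrived."""
    drop_indexes()                  # cheaper inserts without index upkeep
    total = 0
    for sel in selections:
        total += load_chunk(sel)    # one small request per selection
    rebuild_indexes()               # restore read performance at the end
    return total

# Illustrative stubs in place of the actual BW steps
log = []
counts = {"2007": 50_000_000, "2008": 60_000_000, "2009": 60_000_000}

def load_stub(sel):
    log.append(f"load {sel}")
    return counts[sel]

total = run_chunked_load(
    selections=list(counts),
    load_chunk=load_stub,
    drop_indexes=lambda: log.append("drop indexes"),
    rebuild_indexes=lambda: log.append("rebuild indexes"),
)
```

The point of the ordering is that the index drop happens exactly once before any chunk, and the rebuild exactly once after the last chunk, no matter how many selections you split the load into.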

Former Member

You can make use of a delta DTP with the property 'Load Delta Request by Request'.

This way, you need not do a full load; the data will be loaded according to the requests in the old cube, one by one.
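The request-by-request idea can be sketched generically: instead of one monolithic transfer, each source request becomes its own unit of work, so a timeout or failure only affects the request currently being processed. The request IDs and sizes below are made up for illustration:

```python
def transfer_by_request(source_requests, process_one):
    """Move the data one source request at a time. Each call to
    process_one stands for a single DTP run that commits on its own,
    so a failed run only has to repeat the request it was processing."""
    done = []
    for req_id, size in source_requests:
        process_one(req_id, size)   # one run per request, not one giant run
        done.append(req_id)         # this request is now in the new cube
    return done

# Made-up request IDs and sizes summing to 170 million records
moved = []
done = transfer_by_request(
    [("REQU_1", 40_000_000), ("REQU_2", 70_000_000), ("REQU_3", 60_000_000)],
    process_one=lambda rid, n: moved.append((rid, n)),
)
```

Compared with a single 170-million-record request, each unit here is small enough that a per-request timeout limit is never approached.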