on 09-05-2008 8:07 PM
Hello,
I'm working on a client system where there's a problem with the data in one of their InfoCubes: every data record in the cube is duplicated.
The data being loaded comes from two places: a DataStore and another InfoCube. The data in both the DataStore and the other InfoCube is fine; it's only when the data is loaded into the final InfoCube that we see duplicates.
We've already tried deleting and reloading the data, as well as clearing out the PSA, but no luck. We've also debugged all of the routines and update rules, and they are all working fine.
No one here knows the answer, and I've run out of ideas... thanks.
The data going into the final InfoCube is coming from the PSA... don't ask why. Apparently this solved earlier issues, so they decided to load from the PSA instead of the DSO. Could this be causing the doubling?
I was able to fix the problem by removing the PSA from the update and only updating the data targets. I'm still baffled as to why this occurred, though, and would like to find out, because there are plenty of other objects in the system that update from the PSA as well... The data in the PSA looks fine. Could it be that for some reason it was updating from both the PSA and the DataStore, thus inserting two of each record?
Brian: no, the data is being loaded from two different sources and looks different in the DSO and cube before being loaded into the other cube for reporting.
Arun: as far as I know, this is a normal cube.
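To confirm the "two of each record" hypothesis before and after a fix like this, it can help to pull the suspect records (e.g. from LISTCUBE or the PSA) and count occurrences per characteristic key. Here is a minimal sketch in Python; the field names `MATERIAL` and `CALMONTH` are hypothetical, not taken from the thread.

```python
from collections import Counter

def find_duplicates(records, key_fields):
    """Count occurrences of each characteristic-key combination
    and return only the keys that appear more than once."""
    counts = Counter(tuple(rec[f] for f in key_fields) for rec in records)
    return {key: n for key, n in counts.items() if n > 1}

# Hypothetical extract: the first record appears twice, as in the
# symptom described above.
rows = [
    {"MATERIAL": "M1", "CALMONTH": "200809", "AMOUNT": 10.0},
    {"MATERIAL": "M1", "CALMONTH": "200809", "AMOUNT": 10.0},
    {"MATERIAL": "M2", "CALMONTH": "200809", "AMOUNT": 5.0},
]
print(find_duplicates(rows, ["MATERIAL", "CALMONTH"]))
# prints {('M1', '200809'): 2}
```

If every key comes back with a count of exactly 2, that points at a doubled update path rather than bad source data.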
Hi,
check these SAP notes for relevance:
Extracting from InfoProviders with navigation attributes
SAP Note Number: [1145673|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313435363733%7d]
P10:P33: Delete InfoCube and request started repeatedly
SAP Note Number: [983467 |https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d393833343637%7d]
30BSP17: Duplicate records due to reading compressed data
SAP Note Number: [674309|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d363734333039%7d]
Regards
Andreas
Strange, since the records in the InfoCube are request-based, and the request from the cube will be different from the request from the DSO.
Are you using a standard cube for this?
Arun
How are your DSO and cube being loaded? From the same DataSource? If so, it's understandable that you'd have duplicate data, because you're splitting it into two identical data paths and then combining them again... essentially doubling the data.
If it's only a subset of the data, try to identify the subset and restrict it out of either the DSO load or the cube load. I would choose the slowest load and hope that speeds it up.
Brian
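The doubling mechanism Brian describes follows from InfoCube loads being additive: each request adds its key figures to the fact table, so the same source records arriving through two paths simply sum. A minimal sketch of that behavior, with made-up material/month keys:

```python
from collections import defaultdict

def load_into_cube(cube, records):
    """Additive load: key figures accumulate per characteristic
    combination, the way requests add into an InfoCube fact table."""
    for chars, keyfig in records:
        cube[chars] += keyfig

# Hypothetical source data feeding both update paths.
source = [(("M1", "200809"), 100.0), (("M2", "200809"), 50.0)]

cube = defaultdict(float)
load_into_cube(cube, source)  # path 1: e.g. via the DSO
load_into_cube(cube, source)  # path 2: e.g. directly from the PSA
# every key figure now holds exactly double the source value
```

Removing one of the two paths, as the original poster eventually did, leaves each key figure at its correct single value.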