
Record filtered in advance as error records with the same key exist RSM2

Former Member

Hi All,

I need your guidance to resolve the following issue. I would appreciate your early help and response on this. The scenario is:

We have a write-optimized DSO (level 1) and transfer transaction data from it to a standard DSO via DTP. For some of the transaction data the master data is not yet set up (it will be set up in the near future, so that is not the issue right now), which is why a couple of records end up in the error stack with every daily transaction data load.

Earlier, whenever I ran the error DTP after a week's interval, all the error requests in the error stack were consolidated into a single error request. Now, when I execute the error DTP for this standard DSO, the load ends with a red status.

For the first occurrence of a record with a particular semantic key in the error request, the error description says that no master data is maintained for the related data. But for all subsequent occurrences of records with the same semantic key in the same error request, the error description is:

"Record filtered in advance as error records with the same key exist" (RSM2 722)

Long Text:

Diagnosis

The data record was filtered out because data records with the same key have already been filtered out in the current step for other reasons and the current update is non-commutative (for example, MOVE). This means that data records cannot be exchanged on the basis of the semantic key.

System Response

The data record is filtered out; further processing is performed in accordance with the settings chosen for error handling.

Procedure

Check these data records and data records with the same key for errors and then update them.
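
For anyone puzzled by the long text above, the mechanism can be illustrated with a minimal sketch. The following Python snippet is purely conceptual (it is not SAP's actual RSM2 implementation; the record fields and the filter_error_records helper are made up for illustration). It shows why, once one record with a given semantic key fails, every later record with the same key is moved to the error stack too: a non-commutative update such as MOVE must not apply the records for one key out of order.

# Conceptual sketch only, not SAP's RSM2 code.
def filter_error_records(records, is_valid):
    """Split records into (loadable, error_stack) by semantic key."""
    bad_keys = set()                 # semantic keys that already failed
    loadable, error_stack = [], []
    for rec in records:
        key = rec["sem_key"]
        if key in bad_keys:
            # "Record filtered in advance as error records with the
            #  same key exist" (RSM2 722)
            error_stack.append(rec)
        elif not is_valid(rec):
            # First failure for this key, e.g. missing master data.
            bad_keys.add(key)
            error_stack.append(rec)
        else:
            loadable.append(rec)
    return loadable, error_stack

# The second record for key "4711" would be valid on its own, but it is
# still held back because the first record for "4711" already failed.
records = [
    {"sem_key": "4711", "material": "M1"},  # master data missing -> error
    {"sem_key": "4711", "material": "M2"},  # filtered in advance (RSM2 722)
    {"sem_key": "4712", "material": "M2"},  # loads normally
]
ok, errors = filter_error_records(records, lambda r: r["material"] != "M1")
print(len(ok), len(errors))  # prints: 1 2

This is also why the Procedure section asks you to check all records with the same key and then update them: releasing the first erroneous record for a key releases the records that were "filtered in advance" along with it.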

Could you please suggest a solution to resolve the above-mentioned error? Awaiting your reply.

Accepted Solutions (0)

Answers (1)


KamalMehta
Advisor

Hi,

As you know, because records were collected in the error stack in a previous request, every subsequent DTP run collects the records with the same semantic key there again, so as to avoid inconsistency in the data based on the semantic key selection.

Since there is no option to delete a request from the error stack, you should correct the records in the earlier (older) request if possible and then run the error DTP. If you do not need these records at all, you can delete them from the data target.

Hope this helps.

Regards

Kamal

Former Member

Hi Kamal,

Thanks for looking into this.

You are right that we could delete the previously (successfully) updated error request from the DSO, but from there the data is staged further to another cube. So we cannot take this option, as there have been several delta updates into the DSO over the last month, since the last successful error DTP run.

Could you please advise further on how to resolve this?

Thanks,

Megha

KamalMehta
Advisor

Hi Megha,

I don't see any other way of doing this.

For the time being, you can delete the data from all the above targets; by doing this, all the requests in the error stack should get deleted.

Later, you can reload all the above data targets with the DTP, enabling error handling with the option 'Valid records update, reporting possible (request green)'.

That way you would have only one request in the error stack. But going forward you would still have to correct the records and run the error DTP again.

Hope it helps.

Thanks

Kamal Mehta

Former Member

Hi Kamal,

Thanks for your reply, but a different problem arises if I go ahead with the proposed solution. We extract data from flat files on a daily basis, and an individual file is generated every day. So I would not be able to extract a full load with complete history data after completely deleting the data from all the targets.

Any further suggestions on this?

Thanks,

Megha

KamalMehta
Advisor

Hi,

Are you also deleting the PSA? Even though you load from flat files, if you are not deleting the PSA you should still have all the requests there, and you can load from the PSA.

Thanks

Kamal

Former Member

Hi Kamal,

I have only the past 60 days' data in the PSA; that is the reason I can't do a full load.

Thanks for your continued guidance on this.

Thanks,

Megha

Former Member
0 Kudos

Hi All,

Any luck on this?

Thanks,

Megha