Remove duplicate records in CSV file

Former Member

Hi Experts,

The scenario is CSV to IDoc on PI 7.31, and the requirement is to remove duplicate records from the input file. I planned to do this in two steps. In the first mapping, duplicates are removed based on two field values using a UDF and mapped to a target with the same data type as the source; the result is written to a local folder. In the second step, the file is picked up from the local folder and mapped from the file structure to the IDoc structure. The issue is in the first step: the mapping removes the duplicates correctly when tested in the ESR, but at runtime the output written through the FCC channel has a different structure than required. I suspect the FCC channel parameters are at fault. Please find below snippets of the data type, the receiver FCC channel configuration, and the runtime files.
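For reference, the duplicate-removal logic described above can be sketched in plain Java. This is only an illustrative standalone sketch, not the actual PI UDF signature; the 0-based column indexes, the comma delimiter, and the class name are assumptions:

```java
import java.util.*;

public class RemoveDuplicates {
    // Keep only the first occurrence of each row; two rows are considered
    // duplicates when the values in the two key columns match.
    // keyCol1/keyCol2 are 0-based column indexes (hypothetical choice here).
    public static List<String> dedupe(List<String> rows, int keyCol1, int keyCol2) {
        Set<String> seen = new HashSet<>();
        List<String> result = new ArrayList<>();
        for (String row : rows) {
            String[] fields = row.split(",", -1);
            // Composite key from the two configured columns
            String key = fields[keyCol1] + "\u0001" + fields[keyCol2];
            if (seen.add(key)) {        // add() returns false for a repeated key
                result.add(row);
            }
        }
        return result;
    }
}
```

In an actual PI UDF the same idea would operate on the mapping queues rather than on raw lines, but the keep-first-occurrence-per-composite-key logic is the same.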

Data type same for sender and receiver

Receiver FCC channel "Content Conversion" parameters screen:

Mapping removing duplicates snippet:

All fields under "Record" occur in the receiver data type, but the runtime file is not as required.

Please find below snippets of the runtime files on both the sender and receiver side.

Sender side runtime file:

Receiver side runtime file:

But I just require the CSV, with duplicates removed, on the receiver side.

Thanks,

Nithin.

Accepted Solutions (1)


former_member182412
Active Contributor

Hi Nithin,

You can do two mappings in the operation mapping, without writing the file to a local directory:

1) The first mapping removes the duplicate records.

2) The second is your actual file-to-IDoc mapping.
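The chaining idea can be illustrated in plain Java: the second step consumes the first step's output directly in memory, so no intermediate file is needed. This is a toy sketch, not PI code; the class name, the whole-line duplicate check, and the IDoc tag names are assumptions:

```java
import java.util.*;
import java.util.function.Function;

public class OperationMappingChain {
    // Step 1: remove duplicate rows (simplified here to whole-line comparison;
    // a LinkedHashSet drops repeats while preserving order).
    static Function<List<String>, List<String>> removeDuplicates =
        rows -> new ArrayList<>(new LinkedHashSet<>(rows));

    // Step 2: stand-in for the file-to-IDoc mapping (hypothetical tag names).
    static Function<List<String>, String> toIdoc =
        rows -> "<IDOC><ROWS>" + rows.size() + "</ROWS></IDOC>";

    // The chain: step 2 consumes step 1's output directly, exactly as an
    // operation mapping passes one message mapping's output to the next.
    static String run(List<String> input) {
        return removeDuplicates.andThen(toIdoc).apply(input);
    }
}
```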

Regards

Praveen

Answers (3)


Former Member

Hi Praveen, Manoj, Nitin,


Thank you for your responses. End users at different locations upload files to different FTP paths (a sub folder per location) on the same server. The users are only minimally trained, so there is a chance of the same file being placed again, either with a different file name or with the same name. The different-file-name case could be handled by duplicate file handling, but if a file with the same name is placed again while the Duplicate File Handling option is enabled, that file will not be picked up at all. Each file contains multiple records. Please suggest how to avoid duplicate file processing in this case.

Thanks,

Nithin.

former_member182412
Active Contributor

Hi Nithin,

The duplicate file check in the file adapter works as described below; check the blog for more details.


A duplicate file is identified by the combination of the following file properties:

  • Fully qualified path including file name (with extension)
  • Size
  • Last modified time stamp

If any of these properties changes, the duplicate count is reset and the file is considered a new file.
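A minimal sketch of that check, assuming the duplicate key is simply the combination of the three properties listed above (the class name and key format are illustrative, not the adapter's internals):

```java
import java.util.*;

public class DuplicateFileCheck {
    private final Set<String> seen = new HashSet<>();

    // A file counts as a duplicate only if full path, size, and last-modified
    // time stamp all match a previously processed file; a change in any one
    // of them produces a new key, i.e. the file is treated as new.
    public boolean isDuplicate(String fullPath, long size, long lastModified) {
        String key = fullPath + "|" + size + "|" + lastModified;
        return !seen.add(key);
    }
}
```

This also shows why a re-uploaded file with the same name, size, and time stamp is skipped, which is exactly the situation Nithin asked about.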

Regards,

Praveen.

nitindeshpande
Active Contributor

Hi Nithin,

As Praveen and Manoj suggested, you can do this with a single interface. On the receiver side it is not working as expected because the structure in the interface does not match.

Regards,

Nitin

manoj_khavatkopp
Active Contributor

Hi Nithin,

First of all, you don't need two interfaces to achieve this.

1) Create another message mapping in your first scenario only, i.e. source structure to IDoc structure.

2) In your operation mapping, include this message mapping after the first one (which removes the duplicates), so that the input of the second message mapping is the output of the first.

Br,

Manoj

Edit: Praveen was faster 🙂