Data set issue
I have a file on the application server (a score file) which I read with OPEN DATASET and READ DATASET. After reading the flat file, my program performs some manipulations on the imported data. The manipulated data is then sent back to the application server using OPEN DATASET and TRANSFER, so a new file is created on the application server.
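The flow described above can be sketched roughly as follows. This is a minimal illustration, not your actual program: the paths in `lv_infile` and `lv_outfile` are hypothetical placeholders, and the manipulation step is left as a comment.

```abap
DATA: lv_infile  TYPE string VALUE '/tmp/score_in.txt',   " hypothetical path
      lv_outfile TYPE string VALUE '/tmp/score_out.txt',  " hypothetical path
      lv_line    TYPE string.

OPEN DATASET lv_infile FOR INPUT IN TEXT MODE ENCODING DEFAULT.
" FOR OUTPUT truncates an existing file; FOR APPENDING would add to it
OPEN DATASET lv_outfile FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

DO.
  READ DATASET lv_infile INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.    " end of file reached
  ENDIF.
  " ... manipulate lv_line here ...
  TRANSFER lv_line TO lv_outfile.
ENDDO.

CLOSE DATASET lv_infile.
CLOSE DATASET lv_outfile.
```

Note the difference between FOR OUTPUT and FOR APPENDING in the second OPEN DATASET: if your program opens the output file FOR APPENDING, every run adds the score-file records onto the end of the existing file, which matches the duplicate-record symptom described below.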
This new file on the application server is then uploaded into SAP using a standard report (RPTIME00).
Now my problem is that the user is unknowingly running this report multiple times. Each run fills the score-file data into the newly created file again, which is making the interface run slowly.
(For example, the score file has 1000 records and these are transferred to the new file. If the user runs the interface multiple times, the new file grows to 2000 records, then 3000 records, and so on, which causes low-memory problems.)
I want to know whether we can delete the file created by the program after the upload, or whether there is a mode that prevents this duplicate data entry.
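Both options you mention exist in ABAP, as far as I know: DELETE DATASET removes a file from the application server, and opening the output file FOR OUTPUT (rather than FOR APPENDING) overwrites it on each run instead of appending. A minimal sketch of the cleanup after the upload step, assuming the output path is held in a variable `lv_outfile` (hypothetical name):

```abap
" After the upload into SAP has completed successfully,
" delete the generated file so a rerun starts from a clean state.
DELETE DATASET lv_outfile.
IF sy-subrc <> 0.
  " sy-subrc is non-zero if the file could not be deleted
  WRITE: / 'Could not delete file on application server'.
ENDIF.
```

As a design choice, deleting the file after a successful upload protects against reruns even if the user triggers the report again, while switching to FOR OUTPUT protects against duplicates at file-creation time; using both together is the safest.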