on 11-16-2015 9:56 AM
Hi all,
what we want:
We have an output stream and want to write its contents to a file (CSV or JSON). This file serves as a kind of log file for custom error messages and warnings.
In order to limit storage consumption, we would like to limit the number of files that can exist at the same time. Whenever the limit is hit, the oldest file shall be deleted.
We are currently using the File/Hadoop CSV Output adapter as follows:
ATTACH OUTPUT ADAPTER csv_error_output
TYPE toolkit_file_csv_output
TO output_stream_error
PROPERTIES
    dir = '<someFolderName>',
    file = 'log',
    timeBasedRotate = true,
    timestampInFilenames = true,
    timeBasedRotateInterval = 3600,
    timestampInFilenamesFormat = 'HH';
The expected result would be files named in the following format:
log.01 - for 1AM
log.02 - for 2AM
log.03 - ...
...
The number represents the hour, so each log file would be overwritten after 24 hours, which would cap storage consumption.
However, ESP always adds a running counter to the file names, like so:
log.001.01
log.002.02
log.003.02
...
log.325.08
According to the documentation, this should only happen if the fileSizeLimit property is set, which we did not do.
Is there an alternative way or a better solution?
Thanks,
Sven
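One possible workaround, since the adapter does not appear to expose a file-count limit: prune old files with an external cleanup script run periodically (e.g. via cron). Below is a minimal sketch; the directory name, file pattern, and limit are assumptions for illustration, not adapter settings.

```python
# Hedged sketch: cap the number of rotated log files by deleting the oldest ones.
# Assumes the adapter writes files matching 'log.*' into LOG_DIR; adjust both
# to match your actual 'dir' and 'file' adapter properties.
import glob
import os

LOG_DIR = "/var/log/esp"   # hypothetical folder; substitute the adapter's dir
MAX_FILES = 24             # keep at most one day's worth of hourly files

def prune_logs(log_dir: str = LOG_DIR, max_files: int = MAX_FILES) -> None:
    """Delete the oldest files matching 'log.*' until at most max_files remain."""
    files = sorted(glob.glob(os.path.join(log_dir, "log.*")),
                   key=os.path.getmtime)   # oldest first, by modification time
    # files[:-max_files] is empty when len(files) <= max_files, so this is safe.
    for path in files[:-max_files]:
        os.remove(path)
```

Scheduling this alongside the hourly rotation (e.g. `0 * * * * python3 prune_logs.py` in cron) would keep the counter-suffixed files from accumulating, regardless of how ESP names them.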
I have the same problem. If somebody knows the resolution, please let me know.