Getting Data In

Is there a way to configure the Universal Forwarder to prevent duplicate events due to a log file that regenerates?

donaldlcho
New Member

I am using the universal forwarder to index a log file that regenerates every time that a new row is added. In other words, the logging mechanism rewrites the entire file periodically; it doesn't append rows to the previous file. The issue that I am having is that when new rows are added to the log file, the entire file is being re-indexed, which results in duplicate event rows. Is there a way to configure this file (in the inputs and/or props configuration files) to prevent this from happening? Thanks.

0 Karma

woodcock
Esteemed Legend

I would write a shell script to do a delta of the "real file" and a "copy file", then do echo $new_lines >> /other/directory/copy_file; rm -f /real/directory/real_file, and have Splunk monitor /other/directory/copy_file.
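
A minimal sketch of that delta approach, assuming the regenerated file only ever gains rows (existing rows stay byte-identical between rewrites). The grep-based diff, the guard clauses, and running it from cron are my own assumptions, not part of the original answer; the paths are the ones used above.

    #!/bin/sh
    # Hypothetical sketch: append only the lines of the regenerated "real file"
    # that are not already present in the monitored "copy file".
    REAL=/real/directory/real_file
    COPY=/other/directory/copy_file

    [ -f "$REAL" ] || exit 0   # nothing to do if the logger has not rewritten the file yet
    touch "$COPY"              # make sure the copy exists so grep -f does not fail

    # Lines present in REAL but not in COPY (fixed-string, whole-line match).
    # Note: repeated identical rows in the log would be collapsed by this diff;
    # a sorted comm(1) comparison would scale better but loses line order.
    grep -F -x -v -f "$COPY" "$REAL" >> "$COPY"

    # Remove the real file so the next regeneration starts clean, as described above.
    rm -f "$REAL"

Run it from cron (or a similar scheduler) at an interval matching the log's rewrite cadence, and point a standard monitor stanza in inputs.conf at the copy, e.g. [monitor:///other/directory/copy_file].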

0 Karma