Getting Data In

How to prevent error "Bug during applyPendingMetadata, header processor does not own the indexed extractions confs."

krdo
Communicator

Hi,

We are seeing lots of the following errors on our forwarders:

11-21-2016 06:23:13.425 +0100 ERROR TailReader - Ignoring path="D:\LogFiles\4536214783912789-MyFile.csv" due to:   Bug during applyPendingMetadata, header processor does not own the indexed extractions confs.
11-21-2016 06:23:13.425 +0100 ERROR WatchedFile -   Bug during applyPendingMetadata, header processor does not own the indexed extractions confs.

All the files mentioned in the error messages have one thing in common: they contain nothing but the Unicode byte order mark (3 bytes). The files are created by a logging framework, and we currently have no way to prevent them from being created. Also, the files are never overwritten, so there is no need to index them at all.

Is there a way to tell Splunk to ignore files containing less than e.g. 4 bytes? Is there another way to prevent the error?

PS: The question https://answers.splunk.com/answers/321310/splunk-forwarder-file-monitor-is-not-detecting-new.html describes the same problem, but the "solution" does not work for us, as we don't want to index the files - we want Splunk to ignore them.


abhinav_maxonic
Path Finder

I am also getting this error for my CSV files. Splunk is not indexing some of the CSV files; all of them are 117 KB in size. I am creating the CSVs on Linux using this command:

ssh admin@machine1 "some command" > /opt/script_output_data/folder1/folder2/file_name_`date +\%m\%d\%Y_\%H_\%M_\%S`.csv

Error : 01-15-2017 21:40:22.148 -0800 ERROR TailReader - Ignoring path="/opt/script_output_data/folder1/folder2/file_name_01152017_21_40_18.csv" due to: Bug during applyPendingMetadata, header processor does not own the indexed extractions confs.


woodcock
Esteemed Legend

To me this looks like Splunk cannot read its configuration files (e.g. props.conf) because the process running Splunk is not root and the files do not have the proper ownership/permissions.
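
A quick way to check is to compare the user running splunkd with the owner of the conf files. The commands below are only a sketch assuming a typical Linux install with $SPLUNK_HOME set (e.g. /opt/splunkforwarder); adjust for your setup:

# which user is running splunkd?
ps -ef | grep splunkd
# who owns the conf files the forwarder reads?
ls -l $SPLUNK_HOME/etc/system/local/props.conf $SPLUNK_HOME/etc/apps/*/local/props.conf

If the owners don't match, fixing ownership of $SPLUNK_HOME/etc and restarting the forwarder would be the first thing to try.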


krdo
Communicator

That's what I thought at first too, but the error is only reported for empty data files. I'm pretty sure Splunk can access the conf files, because it applies the various settings specified in them just fine.


maciep
Champion

What are you trying to index that is matching those files? Is there any way to identify these files so you can blacklist them in your inputs.conf?
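
For reference, blacklist in inputs.conf is a regular expression matched against the full path of each file under the monitored location. A sketch of what that could look like (the monitor path is taken from the error message above; the regex is only a placeholder, since it sounds like the names may not be distinguishable):

[monitor://D:\LogFiles]
# blacklist is a regex tested against the full path of every file found under the monitor
blacklist = <regex matching only the unwanted files>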


krdo
Communicator

Those files contain CSV-formatted data about how our application performs (KPIs). The files are generated periodically and, once created, are never touched again. The "empty" files are generated by the logging framework we use on application startup (which happens a lot on our systems). The only way to identify those "empty" files is by file size - the file name pattern is the same as for files containing data. So blacklisting is not an option here...
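
For what it's worth, outside of Splunk the BOM-only files can be found by size alone. On a Linux box something like the following would list every CSV smaller than 4 bytes (the directory is just a placeholder, and our forwarders are actually on Windows, where the same check would have to be scripted differently):

find /var/log/myapp -type f -name '*.csv' -size -4c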
