Getting Data In

Why is our Splunk 6.3.2 forwarder locking itself out of a catalina.out archive file every morning?

mmcduffie
New Member

Every morning, the Splunk forwarder on our servers locks itself out of a file and burns a considerable amount of CPU retrying access to it over and over. Moving the file out of Splunk's reach resolves the problem.

Splunk Forwarder Version
6.3.2

lsof output showing the lock on the file

splunkd 19690 root 53r REG 202,1 7718 59481 /var/log/tomcat/catalina.out-20160217.gz

Repeated entries in splunkd.log

02-18-2016 13:14:57.344 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:57.344 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:57.444 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:14:58.445 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:58.445 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:58.543 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:14:59.544 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:59.544 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:59.645 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:00.646 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:00.646 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:00.748 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:01.748 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:01.749 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:01.847 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:02.848 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:02.848 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:02.945 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
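As a quick way to see how bad the churn is, the warnings above can be tallied per archive path. This is just a diagnostic sketch, not anything from the thread; the regex is built from the log lines quoted above (including their verbatim "acesses archieve" spelling), and the real log would be read from `$SPLUNK_HOME/var/log/splunk/splunkd.log`:

```python
import re
from collections import Counter

# Matches the WARN lines quoted above. The "acesses archieve" spelling
# is copied verbatim from the splunkd.log output, not a typo here.
LOCK_WARN = re.compile(
    r"WARN\s+ArchiveProcessor\s+-\s+Could not acesses archieve=(?P<path>\S+),"
)

def count_lock_warnings(lines):
    """Return a Counter mapping archive path -> number of lock warnings."""
    counts = Counter()
    for line in lines:
        m = LOCK_WARN.search(line)
        if m:
            counts[m.group("path")] += 1
    return counts

# Two WARN entries and one INFO entry from the output above, for
# demonstration; in practice, iterate over the open splunkd.log file.
sample = [
    "02-18-2016 13:14:57.444 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800",
    "02-18-2016 13:14:58.445 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz",
    "02-18-2016 13:14:58.543 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800",
]
for path, n in count_lock_warnings(sample).most_common():
    print(f"{path}: {n} lock warnings")
```

One warning per second per stuck archive, as shown in the log excerpt, works out to tens of thousands of entries a day, which matches the "spam" description.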

jcrabb_splunk
Splunk Employee

This is a known issue that is fixed in 6.3.3.

SPL-108219: File deadlock. Can cause repeated entry in splunkd.log which contains, "is already locked with state=". In certain conditions, deadlock can result in elevated CPU usage.

6.3.3 Release Notes: http://docs.splunk.com/Documentation/Splunk/6.3.3/ReleaseNotes/6.3.3#Unsorted_issues. Please upgrade and see if that resolves the issue.
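If an immediate upgrade isn't possible, one standard mitigation (not suggested in this thread, but `blacklist` is a documented inputs.conf monitor attribute) is to stop the forwarder from touching compressed rotations at all, so the deadlocked ArchiveProcessor path is never exercised. The stanza path below is only an example for the Tomcat directory shown above:

```ini
# inputs.conf on the forwarder (e.g. in an app's local/ directory)
[monitor:///var/log/tomcat]
# Skip rotated, compressed archives such as catalina.out-20160217.gz;
# the live catalina.out continues to be monitored.
blacklist = \.(gz|bz2|zip)$
```

Note this skips indexing the compressed rotations entirely, which is usually acceptable when the live file was already indexed before rotation.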

Jacob
Sr. Technical Support Engineer