Getting Data In

Forwarder not recursively indexing in 6.3. Works in 6.2.5. A bug?

Michael
Contributor

I have multiple servers running the Splunk 6.2.5 universal forwarder, and they index recursively just fine from /var/log/...

I just installed 6.3, using the exact same install script (very vanilla, nothing fancy), and it is not indexing anything below the top level of /var/log/. Both servers are running RHEL 6.7.

I tried adding 'recursive = true' but that had no effect.

The inputs.conf files are identical on both:

[monitor:///var/log]
disabled = false
index = main

I also tried adding a second stanza:

[monitor:///var/log/squid]
disabled = false
index = main
recursive = true
sourcetype = squid

This is especially concerning since this is a proxy server (/var/log/squid/) and it's missing the most important stuff! I would go back to 6.2.5 but I just purged all my older sources... 😕

Am I missing something?
thanx!
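For anyone checking the same thing, one quick sanity check is to dump the effective monitor configuration on the forwarder with btool and confirm the stanza is actually being picked up. This is only a sketch; the path below assumes a default universal forwarder install under /opt/splunkforwarder.

/opt/splunkforwarder/bin/splunk btool inputs list --debug | grep -A5 'monitor:///var/log'

If the stanza shows up here with the expected settings, the forwarder has accepted the configuration and the problem is more likely on the reading or forwarding side than in inputs.conf itself.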

1 Solution

Michael
Contributor

Came in this morning, and the indexer is working as it should.

I think it was just a large file and I wasn't giving it enough time. I didn't think the indexer was under a heavy load at the time, but the clue was this WARN message:

"10-06-2015 15:24:37.496 -0500 WARN TailReader - Enqueing a very large file=/var/log/squid/access.log in the batch reader, with bytes_to_read=878207660, reading of other large files could be delayed"

Thanks Anekkanti. Lesson here: RTFEL (read the fine error logs) for clues.

anekkanti_splun
Splunk Employee

What errors do you see in the forwarder's splunkd.log ($SPLUNK_HOME/var/log/splunk/splunkd.log)?
If there are no errors, I would advise collecting more information to help debug the issue. Please follow:
https://wiki.splunk.com/Community:Troubleshooting_Monitor_Inputs#Collect_information
As far as I know, this is not a known bug.
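As a rough sketch (the path assumes a default universal forwarder install under /opt/splunkforwarder), the file-monitoring entries can be pulled out of splunkd.log like this:

grep -E 'TailReader|TailingProcessor|BatchReader|WatchedFile' /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -50

Any WARN or ERROR lines from those components usually point at why a particular file or directory is not being read.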

Michael
Contributor

BTW: I just "downgraded" back to 6.2.5 from 6.3 on one of these boxes -- and it indexes recursively just fine.

Michael
Contributor

@Anekkanti

only one, from two hours ago:
10-06-2015 13:55:37.897 -0500 ERROR TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.

I am getting one that may be a clue (is it just taking a long time?):
10-06-2015 15:24:37.496 -0500 WARN TailReader - Enqueing a very large file=/var/log/squid/access.log in the batch reader, with bytes_to_read=878207660, reading of other large files could be delayed
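For reference, the earlier ERROR just means that no forwarding destination had been configured at that point. A minimal outputs.conf looks roughly like the sketch below, where the group name, hostname, and port are placeholders for your own environment.

[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = indexer.example.com:9997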

anekkanti_splun
Splunk Employee

Can you try this command:

splunk list inputstatus

This will list all the files that Splunk can see and how it's handling them.
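For example (the path assumes a default universal forwarder install; the grep is only there to narrow the output to the directory in question):

/opt/splunkforwarder/bin/splunk list inputstatus | grep -A3 '/var/log/squid'

The output typically shows, per file, how far the tailing processor or batch reader has gotten, which makes it easy to spot files that are queued but not yet read.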

anekkanti_splun
Splunk Employee

10-06-2015 15:24:37.496 -0500 WARN TailReader - Enqueing a very large file=/var/log/squid/access.log in the batch reader, with bytes_to_read=878207660, reading of other large files could be delayed

This is fine. All Splunk is saying here is that it is busy reading one very large file, and reading other big files might be delayed in the meantime.
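If very large files like this are common, the size at which a file is handed off to the batch reader is controlled in limits.conf on the forwarder. To the best of my knowledge the relevant setting is the one sketched below, with a default believed to be 20 MB; check the limits.conf spec for your version before changing it.

[inputproc]
min_batch_size_bytes = 20971520

Files below this size stay with the regular tailing processor, so they are not queued behind a single huge file in the batch reader.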
