Getting Data In

Too many open files on forwarders

oreoshake
Communicator

I'm starting to get a lot of these errors on my forwarders. Any suggestions? Pushing /etc/security/limits.conf doesn't sound ideal.

I'm running heavy forwarders as root.


netwrkr
Communicator

limits.conf shouldn't come into play unless you explicitly have 'nofile' defined.
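For reference, if you did go the /etc/security/limits.conf route, a 'nofile' entry would look roughly like the lines below (a sketch; the 'splunk' user and the 20000 value are just examples, use '*' or 'root' and a number that fits your environment):

# /etc/security/limits.conf -- per-user open file limits
splunk    soft    nofile    20000
splunk    hard    nofile    20000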

This is actually a pretty typical issue on the systems I've seen, and quite easily fixed.

The fix is 'ulimit'.

I typically double the number of open files to start.

On RHEL/Fedora/CentOS, edit /etc/profile (vi /etc/profile) and add a line setting the limit, e.g. 20,000 open files:

ulimit -n 20000

I then 'source' /etc/profile so my current shell picks up the new value:

source /etc/profile

Verify the new limit with 'ulimit -n'.

Then restart Splunk.

Verify Splunk applied the new setting by checking $SPLUNK_HOME/var/log/splunkd.log for a line like:

"INFO ulimit - Limit: open files: 20000 files"
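Putting the whole sequence together, it looks roughly like this (a sketch; it assumes Splunk is installed under /opt/splunk, so adjust paths and the limit value for your install):

# 1. Raise the open-file limit for login shells
echo "ulimit -n 20000" >> /etc/profile

# 2. Apply it to the current shell and confirm
source /etc/profile
ulimit -n

# 3. Restart Splunk so splunkd starts with the new limit
/opt/splunk/bin/splunk restart

# 4. Confirm splunkd picked it up
grep "ulimit" /opt/splunk/var/log/splunkd.log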

jrodman
Splunk Employee

You probably should work with support and/or investigate which files are open by looking in /proc, or using lsof. We might be a bit too aggressive in how many files we open in the new tailing code, but that's just a wild guess.
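If you want to see what splunkd actually has open, something along these lines works on most Linux boxes (a sketch; it assumes the process is named splunkd and that you can read its /proc entry, e.g. as root):

# Find the splunkd PID (oldest matching process)
PID=$(pgrep -o splunkd)

# Count the file descriptors it currently has open
ls /proc/$PID/fd | wc -l

# List the actual files and sockets behind those descriptors
lsof -p $PID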
