Getting Data In

Too many open files on forwarders

oreoshake
Communicator

I'm starting to get a lot of these errors on my forwarders. Any suggestions? Pushing /etc/security/limits.conf doesn't sound ideal.

I'm running heavy forwarders as root.


netwrkr
Communicator

limits.conf shouldn't come into play unless you explicitly have 'nofile' defined.
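For reference, if you do go the limits.conf route, a 'nofile' entry looks like this (the user and values here are illustrative, not a recommendation):

# illustrative /etc/security/limits.conf entries
root    soft    nofile    20000
root    hard    nofile    20000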

This is actually a pretty typical issue on the systems I've seen, and quite easily fixed with 'ulimit'.

I typically double the number of open files to start.

On RHEL/Fedora/CentOS, edit /etc/profile:

vi /etc/profile

and add a line setting the number of open files to 20,000:

ulimit -n 20000

I then 'source' /etc/profile so my current shell will apply the new value:

source /etc/profile

Verify with 'ulimit -n'.

Then restart Splunk.

Verify Splunk applied the new setting by viewing splunk/var/log/splunkd.log, looking for this line: "INFO ulimit - Limit: open files: 20000 files"
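A quick way to check is to grep for it (the path assumes a default install under $SPLUNK_HOME; adjust to your environment):

# show the ulimit lines splunkd logs at startup
grep "open files" $SPLUNK_HOME/var/log/splunkd.log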

jrodman
Splunk Employee

You probably should work with support and/or investigate which files are open by looking in /proc, or using lsof. We might be a bit too aggressive in how many files we open in the new tailing code, but that's just a wild guess.
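For anyone who wants to do that, something along these lines gives a rough count (assuming the process is named splunkd and you want the oldest matching PID; adjust for your setup):

# count file descriptors held by the oldest splunkd process
lsof -p $(pgrep -o splunkd) | wc -l

# or list them directly from /proc
ls -l /proc/$(pgrep -o splunkd)/fd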
