Getting Data In

Splunk Universal Forwarder using 2GB of RAM?

kendrickt
Path Finder

Hi guys,

I've just installed the Universal Forwarder on my NAS server (Windows Server 2008 R2) and configured it to monitor files in a directory.

The directory it is reading from contains 209,000 files of roughly 800 bytes each, dating from November 2013 to now. I told the forwarder to give me the last 6 months.

I noticed that during forwarding, the Universal Forwarder was using 2,072,XXX kilobytes of RAM. I assumed that this was just because it was forwarding files containing over 2.5 million events.

Once forwarding was complete and my indexer had the complete set of data from this forwarder, I expected the RAM utilization to drop, but it hasn't.

Splunk seems to be stuck at just under 2 GB. I've restarted it many times with no luck; it just climbs straight back up to 2 GB and stays there.

If I disable the app that monitors the directory with this large number of files, the forwarder only uses about 50 MB.

The question is: how can I keep this RAM utilisation down? I need that directory monitored.

A couple of things to note:

The inputs.conf is using crcSalt= and initCrcLength=1000. I think this may be relevant, but forwarding will not work without it.
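For context, a minimal sketch of what such a monitor stanza might look like. The path, sourcetype, and the crcSalt value are placeholders/assumptions, not taken from the original post (crcSalt = <SOURCE> is the commonly documented value for directories full of small, similar files):

```ini
# Hypothetical inputs.conf monitor stanza - path and sourcetype are placeholders
[monitor://D:\nas\data]
sourcetype = nas_files
# Use the full path as part of the CRC so near-identical small files are not
# mistaken for one another (assumed value; the original post elides it)
crcSalt = <SOURCE>
# Read more bytes when computing the initial CRC, for files with identical headers
initCrcLength = 1000
```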

1 Solution

alacercogitatus
SplunkTrust

I've seen this before as well. Memory usage correlates directly with the number of files monitored: the more files, the more memory. You may want to remove the files that are more than 6 months old. I mean, you have Splunk to store those contents, no? Why keep the raw data around? Or if you have that many active files, good luck 😄


kendrickt
Path Finder

I've disabled the forwarder for now - I can't re-enable it until I enter a new change window after the weekend.

Thanks for the suggestions so far. I will update this thread when I have more information.


kendrickt
Path Finder

I just added ignoreOlderThan=3h to the inputs.conf.

RAM utilisation is still at 850 MB. There has to be another solution...


kendrickt
Path Finder

One file every 5 minutes, so approximately 36 files of 800 bytes each within the 3-hour window. That shouldn't take 850 MB of RAM, should it?


alacercogitatus
SplunkTrust

Go to your Task Manager and, under Processes, add the column "Handles". How many does splunkd.exe have?


alacercogitatus
SplunkTrust

How many files are newer than 3h old?
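One quick way to answer that outside of Splunk is to count the files directly. A minimal Python sketch (the directory path would be whatever the forwarder is monitoring; it's a placeholder here):

```python
import time
from pathlib import Path

def count_recent_files(directory, max_age_seconds=3 * 3600):
    """Count regular files whose modification time is within the last
    max_age_seconds (default: 3 hours, matching ignoreOlderThan=3h)."""
    cutoff = time.time() - max_age_seconds
    return sum(
        1
        for p in Path(directory).iterdir()
        if p.is_file() and p.stat().st_mtime >= cutoff
    )

# Example (placeholder path):
# print(count_recent_files(r"D:\nas\data"))
```

Note that ignoreOlderThan goes by modification time, so if another application touches old files, they would still count as "new" to the forwarder.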


kendrickt
Path Finder

Can't seem to add a comment to your answer, alacercogitatus, so I'll comment here.

This is a production system I'm dealing with - the files in this directory are used by at least 2 other applications.

Yes - the files should be tidied up, and perhaps there shouldn't be that many in a single directory.

No - it's not something I would easily be authorised to do. I could tell Splunk to ignoreOlderThan=1d, but I think the problem would still exist down the line.


alacercogitatus
SplunkTrust

I don't think it's "files per directory" so much as "total files". I would try the ignoreOlderThan flag; then, if the timestamps aren't updated, the forwarder shouldn't look at those files at all.

