Getting Data In

How do I forward the entire contents of a CSV file daily, even if it's unchanged?

avansteen
New Member

Hello,
I'm attempting to forward a set of .csv files for administrator group auditing. However, the forwarder only sends (or at least the search only returns) changes to the .csv files. For audit purposes, I need the entire contents of each .csv to be ingested, not just the changes.

Is there a way to force the forwarder to ignore the fact that it has already gathered the data?

thanks,

1 Solution

marycordova
SplunkTrust

@avansteen

Maybe you could use [batch://<path>] to import and delete the .csv every time it reads it, so that the file is indexed every time a new one is created, regardless of the filename. You would just want to set up a job that writes out the complete file however often you need: daily, hourly, etc.

I'm not 100% positive, but I think it would work and it's worth testing. The one issue might be if something in the fishbucket remembers the file and doesn't read the new one.

https://docs.splunk.com/Documentation/Splunk/7.1.2/Admin/Inputsconf#BATCH_.28.22Upload_a_file.22_in_...

Another option: if this is a universal forwarder dedicated to ingesting only this .csv file, you could brute-force it with a script and a cron job that wipe the fishbucket. But this would be dangerous if you are collecting any other data with that universal forwarder.
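The brute-force route might look something like the sketch below. The install path and the idea of removing the whole fishbucket directory are assumptions to verify on your own setup, since this resets the read-position state of every file the forwarder tracks:

```shell
#!/bin/sh
# Hypothetical reset of a universal forwarder's fishbucket.
# WARNING: this erases the read-position state for EVERY input on the
# forwarder, so only use it on a forwarder dedicated to this one CSV.
reset_fishbucket() {
    splunk_home="$1"   # e.g. /opt/splunkforwarder (assumed default path)
    fishbucket="$splunk_home/var/lib/splunk/fishbucket"

    "$splunk_home/bin/splunk" stop   # don't wipe state while it's running
    rm -rf "$fishbucket"             # forwarder rebuilds this on start
    "$splunk_home/bin/splunk" start
}

# Example crontab line (daily at midnight; script path is hypothetical):
#   0 0 * * *  SPLUNK_HOME=/opt/splunkforwarder /usr/local/bin/reset_fishbucket.sh
```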

@marycordova


avansteen
New Member

Worked great once I read the instructions!

move_policy = sinkhole
* IMPORTANT: This setting is required. You must include "move_policy = sinkhole" when you define batch inputs.
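For reference, a complete stanza along those lines might look like this sketch (the file path, sourcetype, and index are placeholders, not values from the thread):

```ini
# inputs.conf on the universal forwarder
# batch:// indexes the whole file once, then move_policy = sinkhole
# deletes it, so the next full export is always re-ingested.
[batch:///opt/audit/admin_groups.csv]
move_policy = sinkhole
sourcetype = csv
index = main
```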

Thank you,


marycordova
SplunkTrust

awesome 😄

@marycordova