Getting Data In

batch upload not working - files not being consumed

skattamu
New Member

I am trying a batch upload like this from a light forwarder, but the files are not being consumed (there are only 2 small test files). Am I missing a key attribute? Version 4.1.3, build 80534.

[batch:///var/log/archived_files]
move_policy = sinkhole

Thanks.

2 Solutions

lguinn2
Legend

I'm not certain that this is the cause of your problem, but: the directory /var/log/archived_files is beneath /var/log. If /var/log is being monitored by a [monitor://] stanza in any inputs.conf file, then you are also monitoring /var/log/archived_files.

It probably won't work to have /var/log/archived_files covered by both [monitor://] and [batch://] stanzas. I suggest that you move the archived_files directory somewhere else and set up the batch upload there.

Also, is move_policy = sinkhole on its own line in your inputs.conf file? It appears that way above, but cut-and-paste can mangle that, so I just wanted to mention that each attribute must be on its own line, separate from the stanza header.
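For example, the relocated input might look like this (the /opt/archived_files location is just a placeholder; pick any directory that isn't under a monitored path):

[batch:///opt/archived_files]
move_policy = sinkhole
disabled = false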



Stephen_Sorkin
Splunk Employee

This configuration should work. I just tested it on 4.1.3 with the following in inputs.conf (to check lguinn's hypothesis):

[monitor:///Users/ssorkin/tail]

[batch:///Users/ssorkin/tail/sinkhole]
move_policy = sinkhole

If I run:

ssorkin$ date >> tail/sinkhole/sinkhole.log

then the file is indexed and deleted from that directory.

When you say that the file isn't consumed, do you mean that it's not indexed, not deleted, or both? Does the user that Splunk runs as have sufficient permissions to read the files and remove them from the directory?
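If you're not sure, something like this would check both on a Unix-like host (using the directory from your stanza):

# which user is splunkd running as?
ps -ef | grep splunkd

# Splunk needs read permission on the files, plus write permission on the
# directory itself to delete (sinkhole) them after indexing
ls -ld /var/log/archived_files
ls -l /var/log/archived_files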




Jason
Motivator

If batch follows the same logic as monitor: when you put the same file into a batch input twice, will you have to change crcSalt to make Splunk eat the file again?
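If batch does honor crcSalt the way monitor does (an assumption worth testing), it would go in the stanza like this, reusing the path from the question:

[batch:///var/log/archived_files]
move_policy = sinkhole
# add the full source path to the CRC Splunk uses to decide whether
# it has already seen a file
crcSalt = <SOURCE>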


Stephen_Sorkin
Splunk Employee

Order shouldn't matter, nor should the monitor stanza be required.


skattamu
New Member

Thanks. It started working when I used this stanza (apparently the order mattered):
[batch://]
disabled = false
move_policy = sinkhole

