Getting Data In

inputs.conf -> time_before_close

lpolo
Motivator

Have any of you had the need to use time_before_close in inputs.conf? If so, could you share your scenario?
I am having an issue with a source log where events can be quite large, and as a result some events are not broken correctly.

Thanks,
Lp


sowings
Splunk Employee

I have a log file which is a large XML document composed of various sub-documents, each produced by a job that takes a while to run. Each job writes its data to the file as the output is generated, but the whole XML document isn't closed (appropriate closing tags, etc.) until the whole set of jobs is complete. Sometimes the writing of the log will pause for more than 3 seconds (the default value of time_before_close), and so Splunk was consuming that file halfway through.
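Raising time_before_close in the monitor stanza keeps the file descriptor open across those pauses. As a rough sketch in inputs.conf (the path, sourcetype, and the 30-second value here are just placeholders, not values from my actual setup):

[monitor:///var/log/jobs/job_report.xml]
sourcetype = job_report_xml
# Wait 30 seconds after reaching EOF before closing the file,
# instead of the default 3 seconds.
time_before_close = 30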

If you're seeing events broken before they're complete, consider MAX_EVENTS (it defaults to 256 additional lines, so if you have those multi-line events showing a linecount of 257, this could be the issue), or possibly TRUNCATE.
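If those limits are the culprit, the corresponding props.conf stanza for your sourcetype might look something like this (the sourcetype name and the numbers are only examples, tune them to your data):

[job_report_xml]
# Allow up to 1000 lines per merged event (the default is 256).
MAX_EVENTS = 1000
# Allow up to 100000 bytes per event line before truncation (the default is 10000).
TRUNCATE = 100000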


lpolo
Motivator

Thanks for your comment, but that isn't my scenario.
