Getting Data In

How to prevent Splunk from mixing event timestamps from multiple concurrent scripted inputs?

romedome
Path Finder

I have 6 scripted inputs that use the same script, but with different arguments, and I'm noticing that Splunk is mixing the events. This seems to happen when the previous script instance finishes after the next one has already started. When this happens, the first event comes in with two timestamps (its own and the next event's) and the next event has no timestamp at all 😞

I'm using the same source and sourcetype for all 6 script stanzas in inputs.conf. How can I make sure that Splunk is able to distinguish between executions when parsing the events?


muebel
SplunkTrust

Hi romedome, this seems like it is a bug, and you probably should submit a support ticket to verify.

As to a more immediate workaround, a few options come to my mind:

  • Incorporate a locking mechanism into the script: a while loop at the start that checks for some indication that another instance of the script is running and waits until it's clear (see the sketch after this list). You might still hit some race conditions, but it may be enough to get around the issue.
  • Create 6 copies of the script and assign each one to a different input. This is kind of kludgy, but it seems like it might work.
  • Space out the scripted inputs so that they don't run at the same time, i.e., one runs at 5 after, the next at 10 after, and so on. Obviously this depends on how often you need each input to run and how long the script takes, but it might work.
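
For the first option, here is a minimal sketch of a lock-file approach, assuming the scripted input is a Python script; the lock path, timeout, and function names are placeholders, not anything from your actual setup:

    import os
    import sys
    import time

    # Hypothetical lock file path; point it somewhere the Splunk user can write.
    LOCK_FILE = "/tmp/my_scripted_input.lock"
    WAIT_SECONDS = 1
    MAX_WAIT = 60  # give up eventually so a stale lock can't hang the input forever

    def acquire_lock():
        """Spin until we can create the lock file exclusively, or time out."""
        waited = 0
        while waited < MAX_WAIT:
            try:
                # O_CREAT | O_EXCL fails if the file already exists, so only one
                # instance gets past this point at a time.
                fd = os.open(LOCK_FILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
                os.write(fd, str(os.getpid()).encode())
                os.close(fd)
                return True
            except FileExistsError:
                time.sleep(WAIT_SECONDS)
                waited += WAIT_SECONDS
        return False

    def release_lock():
        try:
            os.remove(LOCK_FILE)
        except FileNotFoundError:
            pass

    if __name__ == "__main__":
        if not acquire_lock():
            sys.exit(1)  # another instance held the lock for too long
        try:
            # ... existing collection logic goes here, writing events to stdout ...
            pass
        finally:
            release_lock()

Each instance tries to create the lock file exclusively, waits while another instance holds it, and removes it when done, so overlapping runs don't write their events to stdout at the same time.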

Please let me know if this answers your question! (or helps in any way at least)

romedome
Path Finder

I think you might be right; this looks like it could be a bug. Yesterday I gave each script stanza in inputs.conf a slightly different source value and it stopped mixing the output from the instances. I previously had 6 separate scripts, but in an attempt to make this a more "elegant" solution I consolidated them into a single script, and that's when this issue reared its head.
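
For anyone finding this later, the workaround looks roughly like this in inputs.conf; the script path, arguments, interval, and source names below are placeholders, not the actual values from this setup:

    [script://$SPLUNK_HOME/bin/scripts/check_status.py arg1]
    interval = 300
    sourcetype = my:script:output
    source = script_check_arg1

    [script://$SPLUNK_HOME/bin/scripts/check_status.py arg2]
    interval = 300
    sourcetype = my:script:output
    source = script_check_arg2

Keeping the sourcetype the same preserves any props/transforms you already have, while the distinct source value per stanza is what, per the post above, stopped the events from being mixed.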
