How would the Kafka Messaging Modular Input behave if a Splunk failure occurs?

markbatesplunk
New Member

Hi,

I have a question from our Splunk team. We intend to use the Kafka Messaging Modular Input to ingest Kafka events. If a Splunk failure occurs, how would the modular input behave? The events we are consuming are being used to reconcile an end-to-end process, so we need to know whether the modular input would continue to ingest events from Kafka and then fail when it tried to index those events, or whether it would be aware of the problem with Splunk and stop ingesting Kafka events.

0 Karma
1 Solution

ryanoconnor
Builder

Part of the answer to this question may depend on how your deployment is configured.

The modular input for this TA runs as part of Splunk: Splunk monitors the stdout of the kafka.py file. If the Splunk process has some sort of failure, then you will likely stop ingesting data as well.

Indexing may still be going on if a separate Splunk system is handling your indexing; however, if the Splunk system that runs the Kafka modular input fails, then your Kafka data won't be indexed. Other data may still be coming in, though, if indexing is happening on another system.
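
To make that concrete, here is a rough sketch of what a stdout-based Kafka modular input boils down to. This is not the TA's actual kafka.py; it assumes the kafka-python client, and the topic and broker names are placeholders:

    # Rough sketch only, not the TA's actual kafka.py. Assumes the kafka-python
    # client; the topic and broker names are placeholders.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "example_topic",                      # placeholder topic
        bootstrap_servers="kafka-host:9092",  # placeholder broker
        enable_auto_commit=True,
    )

    # splunkd launches the script and indexes whatever it writes to stdout.
    # If splunkd dies, the child script dies with it, so consumption (including
    # offset commits) and indexing stop together.
    for message in consumer:
        print(message.value.decode("utf-8"), flush=True)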

You can set up two different types of monitoring that might help detect these sorts of failures:

  1. Process monitoring outside of Splunk using something like Zabbix (a rough sketch of this kind of check is below).
  2. A scheduled search in Splunk that watches for lapses in data from Kafka.
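
For option 1, the check itself can be very simple. The sketch below is a stand-in for the kind of probe a Zabbix item would run; the process name patterns (splunkd, kafka.py) are assumptions you would adjust for your own host:

    #!/usr/bin/env python3
    # Rough sketch of an external liveness check (the kind of probe a Zabbix
    # item would run). Process name patterns are assumptions; adjust as needed.
    import subprocess
    import sys

    def process_running(pattern: str) -> bool:
        """Return True if any running process command line matches the pattern."""
        result = subprocess.run(["pgrep", "-f", pattern], capture_output=True)
        return result.returncode == 0

    if __name__ == "__main__":
        # splunkd is the main Splunk process; kafka.py is the modular input it spawns.
        missing = [p for p in ("splunkd", "kafka.py") if not process_running(p)]
        if missing:
            print("ALERT: missing process(es): " + ", ".join(missing), file=sys.stderr)
            sys.exit(1)
        print("OK: splunkd and the Kafka modular input are both running")

For option 2, a scheduled search that counts events from your Kafka sourcetype over, say, the last hour and alerts when the count drops to zero would cover the lapse-detection side.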

markbatesplunk
New Member

Makes sense, Ryan. Thanks for your input.

0 Karma

ppablo
Retired

Hi @markbatesplunk

Just to clarify for other users: are you talking about modular inputs in general, or are you referring specifically to the Kafka Messaging Modular Input?
https://splunkbase.splunk.com/app/1817/

0 Karma

markbatesplunk
New Member

Correct ;O)

0 Karma