How would the Kafka Messaging Modular Input behave if a Splunk failure occurs?

markbatesplunk
New Member

Hi,

I have a question from our Splunk team. We intend to use the Kafka Messaging Modular Input to ingest Kafka events. If a Splunk failure occurs, how would the modular input behave? The events we are consuming are used to reconcile an end-to-end process, so we need to know whether the modular input would continue to ingest events from Kafka and then fail when it tried to index them, or whether it would be aware of the problem with Splunk and stop ingesting Kafka events.

0 Karma
1 Solution

ryanoconnor
Builder

Part of the answer to this question may depend on how your deployment is configured.

The modular input for this TA runs inside Splunk: in this case, Splunk is monitoring the stdout of the kafka.py script. If the Splunk process has some sort of failure, then you will most likely stop ingesting data as well.

Indexing may still be going on if a separate Splunk system handles your indexing; however, if the Splunk system running the Kafka modular input fails, your Kafka data won't be indexed. Other data may still be coming in if indexing is happening on another system.

You can set up two different types of monitoring that might help detect these sorts of failures:

  1. Process monitoring outside of Splunk using something like Zabbix.
  2. A scheduled search in Splunk that watches for lapses in data from Kafka (a sketch follows this list).
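
A minimal sketch of option 2, assuming your Kafka events land in an index named main with sourcetype kafka (both names are assumptions; adjust them to match your actual input configuration). Save this as a scheduled alert that runs every 15 minutes and triggers when it returns a result:

      index=main sourcetype=kafka earliest=-15m latest=now
      | stats count AS kafka_events
      | where kafka_events == 0

If the search returns a row, no Kafka events have been indexed in the last 15 minutes, which could mean the modular input, or the Splunk instance running it, has stopped working.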

markbatesplunk
New Member

Makes sense, Ryan. Thanks for your input.

0 Karma

ppablo
Retired

Hi @markbatesplunk

Just to clarify for other users: are you talking about modular inputs in general, or are you referring specifically to the Kafka Messaging Modular Input?
https://splunkbase.splunk.com/app/1817/

0 Karma

markbatesplunk
New Member

Correct ;O)

0 Karma