
Query - SQS-based S3 Input

chintu_jain
Explorer

In the AWS Add-on for Splunk, if I create a CloudTrail input using SQS-based S3, does Splunk clean up the messages in the queue after it processes the file?

Can I have two forwarders with identical inputs created for the same SQS queue, but sending data to different indexes?


ameyap16
Engager

Yes, the Splunk heavy forwarders delete the messages from the SQS queue once they have been successfully processed. Of course, you have to grant the required permissions (including sqs:DeleteMessage) to the AWS user or role that Splunk uses.
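For context, the cleanup follows the standard SQS receive/delete cycle, which is why the AWS principal needs sqs:ReceiveMessage and sqs:DeleteMessage (plus s3:GetObject for the referenced files). A minimal boto3 sketch of that cycle, with a placeholder queue URL and region (the add-on does all of this internally), looks like this:

```python
# Minimal boto3 sketch of the receive -> process -> delete cycle performed for
# an SQS-based S3 input. Queue URL and region are placeholders; this only
# illustrates why the sqs:DeleteMessage permission is required.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-notifications"

resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])
    # When S3 notifications arrive via SNS, the S3 event is nested inside the
    # SNS envelope's "Message" field; otherwise the body is the event itself.
    records = json.loads(body.get("Message", msg["Body"])).get("Records", [])
    for rec in records:
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)  # fetch the CloudTrail file
        _ = obj["Body"].read()                       # ...process/ingest it here
    # Only after successful processing is the message removed from the queue.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```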

"Can i have two forwarders where identical inputs is created for the same SQS queue, but sending data to different indexes?"
If you have 2 HFs with identical inputs, the traffic will be distributed by them. This essentially makes the processing of the messages and the ingestion faster and is very useful for ingesting from large S3 buckets. However, I am not sure that will solve your purpose.
In order to send all the S3 data to 2 different indexes, you should have 2 SQS queues with the same messages. You can do this by having 2 SQS queues subscribed to the same SNS which get the notification from the S3 bucket.
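If it helps, here is a rough boto3 sketch of that fan-out (topic, queue, and bucket names are placeholders, and the SNS topic additionally needs a policy allowing S3 to publish to it, which is omitted here). Each heavy forwarder's SQS-based S3 input would then point at one of the two queues and write to its own index:

```python
# Rough boto3 sketch: fan out S3 ObjectCreated notifications to two SQS queues
# through one SNS topic. All names/ARNs below are placeholders.
import json
import boto3

region = "us-east-1"
sns = boto3.client("sns", region_name=region)
sqs = boto3.client("sqs", region_name=region)
s3 = boto3.client("s3", region_name=region)

topic_arn = sns.create_topic(Name="cloudtrail-object-created")["TopicArn"]

for name in ("cloudtrail-queue-index-a", "cloudtrail-queue-index-b"):
    q_url = sqs.create_queue(QueueName=name)["QueueUrl"]
    q_arn = sqs.get_queue_attributes(
        QueueUrl=q_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]
    # Allow the SNS topic to deliver notifications to this queue.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": q_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    }
    sqs.set_queue_attributes(QueueUrl=q_url, Attributes={"Policy": json.dumps(policy)})
    sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=q_arn)

# Point the bucket's ObjectCreated notifications at the SNS topic so both
# queues receive a copy of every notification.
s3.put_bucket_notification_configuration(
    Bucket="my-cloudtrail-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```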
