Why does the Indexer sometimes not classify AWS GuardDuty data with the proper sourcetype?

emcbolton
New Member

We are using the HTTP Event Collector (HEC) to ingest AWS GuardDuty CloudWatch Events via AWS Kinesis Firehose. I have worked with a Splunk SE to get this running in two environments. One environment (dev license) is a single-instance deployment; the other is distributed, with the HEC instance separate from the Indexer.

In both environments, the Indexer does not always classify the incoming Firehose data as aws:cloudwatch:guardduty, and the unclassified data never appears in the Splunk GuardDuty dashboard. What I mean is that sometimes the events arrive with clean Splunk formatting (plus signs and all), and other times multiple GuardDuty events land in one text blob with no Splunk formatting, just a long string of text.

Is there a way to fix this, alert on the issue from Splunk, or have Splunk re-read the data?
Does this likely mean I need a Lambda function to break these blobs into separate events (maybe it's a CloudWatch-to-Firehose issue) before they are delivered to Splunk?
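For reference, here is the kind of Firehose data-transformation Lambda I have in mind. This is only a sketch of my assumption about the fix, not something Splunk or AWS has confirmed for this setup: it decodes each Firehose record, splits any concatenated GuardDuty JSON objects, and re-emits them newline-delimited under the same recordId so Splunk's line breaker can separate them. The handler name and splitting logic are my own; only the records/recordId/result/data contract is the standard Firehose transformation interface.

```python
import base64
import json


def lambda_handler(event, context):
    """Hypothetical Firehose transformation: split concatenated JSON
    events in a record into newline-delimited events."""
    decoder = json.JSONDecoder()
    output = []
    for record in event["records"]:
        raw = base64.b64decode(record["data"]).decode("utf-8")
        try:
            events, idx = [], 0
            while idx < len(raw):
                # raw_decode parses one JSON object and reports where it ends,
                # which lets us walk through back-to-back objects like {...}{...}
                obj, end = decoder.raw_decode(raw, idx)
                events.append(json.dumps(obj))
                idx = end
                while idx < len(raw) and raw[idx].isspace():
                    idx += 1
            payload = "\n".join(events) + "\n"
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(payload.encode("utf-8")).decode("utf-8"),
            })
        except json.JSONDecodeError:
            # Pass malformed records back unchanged so Firehose can
            # route them to the error/backup destination for inspection.
            output.append({
                "recordId": record["recordId"],
                "result": "ProcessingFailed",
                "data": record["data"],
            })
    return {"records": output}
```

Note that a Firehose transform must return exactly one output record per input recordId, so the split events are re-joined with newlines in a single record rather than fanned out into multiple records.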
