Getting Data In

Monitoring Cumulative Dumps

Ron_Naken
Splunk Employee

When monitoring an EMC CLARiiON, the CLI tool that dumps the logs simply dumps all logs from the device, including any logs already exported on previous runs. We intend to run the tool every hour and have it dump the logs to the same file, so each hour the file is overwritten with its current data plus any new logs. For instance:

Hour 1 (/test/log.txt):
event1

Hour 2 (overwritten /test/log.txt):
event1
event2

Hour 3 (overwritten /test/log.txt):
event1
event2
event3

What is the recommended way to index this file?

Tags (1)
1 Solution

ftk
Motivator

Splunk creates CRC hashes of the first and last 256 bytes of any file it monitors. Do your cumulative dumps change the beginning of the file, or is that data always the same? If it stays the same, setting up a simple monitor stanza will be enough to index only the new events added to the file by your dumps.
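As a minimal sketch, such a monitor stanza in inputs.conf might look like the following (the sourcetype and index names here are assumptions, not anything from the original thread; adjust them to your environment):

```ini
# inputs.conf -- minimal sketch; sourcetype/index values are assumptions
[monitor:///test/log.txt]
# hypothetical sourcetype; use whatever matches your parsing rules
sourcetype = emc:clariion:logs
index = main
disabled = false
```

Because the beginning of the file is unchanged between dumps, Splunk should treat the rewritten file as the same file and resume from its saved seek pointer, picking up only the appended events. Note that if the dump ever did rewrite the start of the file, adding crcSalt = &lt;SOURCE&gt; would force the whole file to be re-indexed on every run and produce duplicate events, so verifying that the file header is stable matters here.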

