Getting Data In

Monitoring Cumulative Dumps

Ron_Naken
Splunk Employee

When monitoring an EMC CLARiiON, the CLI tool that dumps the logs exports all logs from the device on each run, including logs already exported by previous runs. We intend to run the tool every hour and have it write to the same file, so each hour the file is overwritten with its existing data plus any new logs. For instance:

Hour 1 (/test/log.txt): event1
Hour 2 (overwritten /test/log.txt): event1 event2
Hour 3 (overwritten /test/log.txt): event1 event2 event3

What is the recommended way to index this file?

1 Solution

ftk
Motivator

Splunk creates CRC hashes of the first and last 256 bytes of any file it monitors. Do your cumulative dumps change the beginning of the file, or is that data always the same? If it stays the same, setting up a simple monitor stanza will be enough to index only the new events added to the file by your dumps.
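If the start of the file really is stable across dumps, a plain monitor input should suffice. A minimal sketch, assuming the path from the question; the sourcetype name is a placeholder for your environment:

```ini
# inputs.conf -- monitor the cumulative dump file
[monitor:///test/log.txt]
sourcetype = emc:clariion    ; hypothetical sourcetype name
disabled = false
```

Because Splunk also tracks how far into each file it has read, an hourly overwrite that only appends events is picked up from the previous offset, so only the new events get indexed. Avoid adding `crcSalt = <SOURCE>` here: that forces the entire file to be re-read, which would duplicate the older events on every dump.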


