Getting Data In

Best ways to monitor a clustered system's data file?

Runals
Motivator

I have a situation where two systems will write to the same NFS-mounted file, depending on which one is active. I'm trying to figure out the best approach to importing that data. If I put a local agent on both boxes, I'm guessing both agents will read the log file independently and I will get duplicate events in Splunk. The other end of the spectrum is to install the agent on just one box and figure that if that box is rebooted, it should still have access to the file (even if it isn't the primary box at that point) and will simply pick up where it left off indexing the file.
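For context, the single-agent option would just be a plain file monitor input on whichever box runs the forwarder. A rough sketch, with a placeholder path and sourcetype (the real mount point and sourcetype would be whatever the app actually uses):

    # inputs.conf on the one host running the forwarder
    # /mnt/shared/app.log and app_log are placeholders for illustration
    [monitor:///mnt/shared/app.log]
    sourcetype = app_log
    disabled = 0

Since the forwarder tracks how far it has read in each file, in theory it should pick up where it left off after a reboot, provided the NFS mount is back before Splunk starts.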

Has anyone dealt with this sort of situation before? If so, I'd be interested in hearing how you addressed it.

anantdeshpande
Path Finder

I have the same use case and am looking for a solution. Runals, do you have a solution now?

Runals
Motivator

Man, that was a long time ago. I think we ended up just pulling in the data from one of the two systems. Not pretty, but /shrug.

mikelanghorst
Motivator

If you're using some sort of cluster software, you could also go with a third install: one on each host, plus a third instance configured to run only from the "primary" node.
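Roughly: the two host-local installs monitor only host-local logs, and the cluster-managed third instance is the only one that monitors the shared file, so nothing gets indexed twice. A sketch with made-up paths (swap in your actual mount point, index, and sourcetype):

    # inputs.conf on each host's local forwarder -- local logs only,
    # deliberately NOT the shared NFS file
    [monitor:///var/log/messages]
    disabled = 0

    # inputs.conf on the cluster-managed third instance (started by the
    # cluster software only on the active/primary node)
    # /mnt/shared/app.log and app_log are placeholders
    [monitor:///mnt/shared/app.log]
    sourcetype = app_log
    disabled = 0

The cluster resource that starts and stops your application would also start and stop this third Splunk instance, so the shared file is only ever read from one node at a time.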
