
600+ saved searches piped to monitoring

bensbrowning
Explorer

Hey guys,

We have a modest Splunk deployment (a few hundred forwarders, 4 indexers, 2 search heads, and a deployment server) taking in around 60 GB per day (~1 million events per 5 minutes). We'd like to search these events for specific regexes and, on a hit, kick them off to a script that injects alarms into Zabbix, our monitoring platform.

What I do not want (and assume I cannot do) is to set up 600 real-time searches and leave them running 24/7. What I especially do not want is to cripple Splunk for the sake of this monitoring.

What I'd like to do is run one real-time search that reads a table (or some such) of patterns, checks each event against it, kicks the event to the script on a hit (tagged with an alarm name, e.g. alarm_name = foo if the event matches /foo foo bar/), and passes it through otherwise.
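To make that concrete, the table I'm picturing would look something like this (purely illustrative; the file name and columns are made up):

    # alarm_patterns.csv -- one row per alert condition
    pattern,alarm_name
    "foo foo bar",foo
    "disk [0-9]+ full",disk_full
    "kernel panic",kernel_panic

Each incoming event would be checked against every pattern; on a match, the event plus its alarm_name goes to the Zabbix script, and everything else just passes through. Does anyone know a good way to do this?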

Thanks!

~Ben

lguinn2
Legend

This idea is very attractive, but I can't think of a way to test for 600+ conditions that would perform fast enough to be effective as a real-time search.

Can you group your 600 searches in some way? For example, by source, sourcetype, or host?

I could see running a few real-time searches plus some scheduled searches that run every 5 minutes, etc. Only a critical few items probably deserve a real-time search anyway. Each real-time search consumes a CPU core, so if you want a dozen of them running at once, you should probably set up a separate search head just for your alerting searches.
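For the scheduled flavor, something along these lines in savedsearches.conf would run every five minutes over a closed window and hand any matches to your Zabbix script via the (legacy) scripted alert action. This is only a sketch — the stanza name, the script name zabbix_inject.sh, and the alarm_patterns lookup (sketched in the next example) are all invented:

    [alarm_scan_syslog]
    # the actual matching is done by a lookup; see the next example
    search = sourcetype=syslog | lookup alarm_patterns pattern AS _raw OUTPUT alarm_name | where isnotnull(alarm_name)
    # every 5 minutes, over the previous whole 5-minute window
    cron_schedule = */5 * * * *
    dispatch.earliest_time = -5m@m
    dispatch.latest_time = @m
    enableSched = 1
    # fire the alert whenever at least one event matches
    counttype = number of events
    relation = greater than
    quantity = 0
    # legacy scripted alert; the script lives in $SPLUNK_HOME/bin/scripts
    action.script = 1
    action.script.filename = zabbix_inject.sh

Snapping both ends of the window to the minute (-5m@m to @m) keeps consecutive runs from missing or double-counting events.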

If you group your searches, then you could probably use lookup tables to further match and refine your alerts.
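To make a lookup match raw events, you can define it in transforms.conf with a wildcard match type and apply it against _raw. One caveat: CSV lookups support WILDCARD and CIDR matching, not true regex, so the 600 expressions would have to be rewritten as wildcard patterns (or you'd need an external scripted lookup for real regex). A sketch, with invented names:

    # transforms.conf
    [alarm_patterns]
    filename = alarm_patterns.csv
    match_type = WILDCARD(pattern)
    max_matches = 1

Each per-group search then stays small because it only scans its own slice of the data, e.g.:

    sourcetype=syslog
    | lookup alarm_patterns pattern AS _raw OUTPUT alarm_name
    | where isnotnull(alarm_name)
    | stats count latest(_raw) as sample by host, alarm_name

The final stats also collapses repeated hits, which should cut down the volume of alarms you push into Zabbix.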

bensbrowning
Explorer

There's not a clean way to group them into significantly smaller searches. I'm not tied to real-time search per se; a 5-minute gap would be fine. Still a daunting task.

Yeah, that's about what I was thinking. I was hoping there was a way to do this cleanly, but now I'm starting to think we may need to explore workarounds.


HansWurscht
Path Finder

Hi,

Did you manage to solve this in a good way? I'm now facing a similar problem in our deployment.

Thanks!
