Splunk Search

Real-time searches not keeping up

a212830
Champion

Hi,

I recently noticed that my real-time searches are not "keeping up". For example, if I show a 5-minute window, it will be behind by about 3 minutes, but if I run the same search over the last 15 minutes, the missing events appear. Any suggestions on what could be causing this?

I noticed the following on one of my heavy forwarders - I'm assuming the blocks aren't a good thing and could be related?

metrics.log:08-08-2013 16:50:50.252 -0400 INFO Metrics - group=queue, name=aeq, blocked=true, max_size_kb=500, current_size_kb=499, current_size=61, largest_size=61, smallest_size=0
metrics.log:08-08-2013 16:50:50.252 -0400 INFO Metrics - group=queue, name=indexqueue, blocked=true, max_size_kb=500, current_size_kb=499, current_size=612, largest_size=705, smallest_size=0
metrics.log:08-08-2013 16:50:50.252 -0400 INFO Metrics - group=queue, name=typingqueue, blocked=true, max_size_kb=500, current_size_kb=499, current_size=659, largest_size=703, smallest_size=198
metrics.log:08-08-2013 16:51:21.253 -0400 INFO Metrics - group=queue, name=aeq, blocked=true, max_size_kb=500, current_size_kb=499, current_size=61, largest_size=61, smallest_size=0
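
For reference, a search along these lines should show how often and which queues are blocking (this assumes the forwarder's _internal index is searchable from the search head; my_heavy_forwarder is just a placeholder hostname):

index=_internal source=*metrics.log* host=my_heavy_forwarder group=queue blocked=true | timechart span=1m count by name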

0 Karma

jtacy
Builder

I agree with you that there could be some queuing/blocking causing this; the fact that the "last 15 minutes" searches look OK might be a red herring. You might check out Troubleshooting Blocked Queues. The big question will probably be whether the queuing is happening at the HWF itself or the indexers to which it is load balancing; the Indexing > Indexing Performance component of Splunk on Splunk might be helpful to visualize that.
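
One rough way to see where the blocking is happening is to split the blocked-queue events by host, something like this (assuming _internal from both the HWF and the indexers is searchable; adjust the time range as needed):

index=_internal source=*metrics.log* group=queue blocked=true | timechart span=5m count by host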

If your HWFs are version 5.0.2 or later, they should automatically be forwarding their _internal index and you can manually add them to the S.o.S server list so you can visualize them too. You can follow step 4 (other steps not needed) of the accepted answer of this question to do so.
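
To sanity-check that a HWF really is forwarding its _internal data before you add it to the S.o.S server list, a search like this should return recent events (my_heavy_forwarder is just a placeholder for the forwarder's hostname):

index=_internal host=my_heavy_forwarder source=*metrics.log* | stats count max(_time) as latest_event by host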

Good luck! S.o.S won't show resource usage information unless you also install the technology add-on but I'd make sure to check the host machine metrics as well, especially if these are small boxes like VMs.

HiroshiSatoh
Champion

Thank you, jtacy! My reply was a bit unkind.

0 Karma

HiroshiSatoh
Champion

I think the cause can only be found in your environment. Please use Splunk itself to look for a causal relationship across the forwarders. The Search Job Inspector might also give us a hint.

0 Karma

linu1988
Champion

A real-time search only shows events once they arrive at the indexer, so there is no problem with the real-time search itself. Check the window you have set. That also explains your second case: by the time you search the last 15 minutes, the events have been indexed, so of course they show up. As soon as events arrive, the real-time search will display them; until then it simply keeps running.
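
If you want to confirm how far behind the events actually are when they reach the indexer, comparing _indextime with _time is a simple check; my_index here is just a placeholder for the index you are searching:

index=my_index earliest=-15m | eval lag_seconds=_indextime-_time | stats avg(lag_seconds) max(lag_seconds) by host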

0 Karma

HiroshiSatoh
Champion

I also think the heavy forwarders are having an impact. However, I cannot reproduce it in my environment; it can only be demonstrated in yours.

0 Karma

a212830
Champion

Not sure what you are asking. The search was basic - a real-time search for an index - can't be much simpler.

0 Karma