Getting Data In

Indexer Latency

hartfoml
Motivator

I have one heavyweight forwarder that is collecting from over 600 Universal Forwarders. I also have syslog-ng installed on the heavy forwarder, writing syslog to a folder that Splunk reads.

I am seeing (in SOS) an indexing latency of more than 1000 seconds on some of the indexes.
Where should I look next for the cause?
How can I tell whether the indexers or my heavy forwarder is the bottleneck?

I think it is the indexers, but where on the three indexers should I look to see what the issue is?
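
One thing I can check (if that is even the right place to look) is whether the indexing queues on the indexers are filling up. A sketch, assuming the indexers' _internal logs are searchable from my search head:

index=_internal source=*metrics.log* group=queue name=indexqueue | eval pct_full = round(current_size_kb / max_size_kb * 100, 1) | timechart span=5m max(pct_full) by host

If the queues sit near 100% on the indexers, the bottleneck is probably there rather than on the forwarder.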

1 Solution

hartfoml
Motivator

I found this article that talks about how to see the index time:

http://splunk-base.splunk.com/answers/64388/is-_indextime-deprecated

I used the suggestion from Martin to write this search:

index=firewall | eval diff = _indextime - _time | where diff > 0 | rename _indextime AS indxtime | convert timeformat="%m/%d/%y %H:%M:%S" ctime(indxtime) | table _time indxtime diff

This allowed me to see the diff between _time and _indextime.

Now I can set an alert if this number gets too high, and look for patterns.
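
Something like this could be the alert search, firing whenever events come in more than 900 seconds behind (the threshold is just an example to tune):

index=firewall | eval diff = _indextime - _time | where diff > 900 | stats count

Alert when count is greater than zero.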


steve
Path Finder

This worked for me. The table command wouldn't display the raw _indextime (it's a hidden internal field), so I copied it into a regular field with eval first.

source=/var/log/* | eval time=_time | eval itime=_indextime | eval latency=(itime - time) | table time,itime,latency,_raw

martin_mueller
SplunkTrust

You could do something like this:

some search | eval diff = _indextime - _time | where diff > some value

and go from there to see patterns of offending sources, hosts, splunk_servers, whatever.
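
A concrete sketch of that (the 900-second cutoff and the split fields are just examples; splunk_server is the indexer that handled each event):

some search | eval diff = _indextime - _time | where diff > 900 | stats count avg(diff) max(diff) by host, splunk_server

If the laggy events cluster on one splunk_server, look at that indexer; if they span all indexers but share a host or source, look at the forwarder side.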

hartfoml
Motivator

Martin, thanks for the help. I tried this search:
index=firewall | eval diff = _indextime - _time | where diff > 890 | table _indextime _time

I could not find the _indextime field, and the search would only return all results or no results as I increased the greater-than number. I wanted to see the diff between the index time and the event time, so I added the table command, but all that showed was _time, not _indextime. Do you know how I can see the index time?
