Getting Data In

Why am I missing metrics.log data from some of my forwarders/hosts?

hettervik
Builder

Hi folks!

I've made a search that returns all hosts that send events of some kind to the indexer, but don't send internal metrics events.

index=_internal source=*metrics.log
| dedup host
| append [| metadata type="hosts"]
| stats count by host
| search count=1
| lookup dnsLookup host OUTPUT ip
| fields host, ip

The search first finds all hosts that send metrics data, and then (via append) finds all hosts that send any data at all, using the metadata command. Every host that sends metrics data should show up twice, while the hosts that don't will only show up in the appended metadata search. Thus, if we count them, all hosts that show up only once (count=1) do not send metrics data to my indexer.
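For what it's worth, the same comparison can also be written with an explicit flag instead of counting duplicates, which some may find easier to read. This is just a sketch of the same idea (dnsLookup being my own lookup, as above):

index=_internal source=*metrics.log
| stats count by host
| eval has_metrics=1
| append [| metadata type="hosts" | fields host | eval has_metrics=0]
| stats max(has_metrics) as has_metrics by host
| where has_metrics=0
| lookup dnsLookup host OUTPUT ip
| fields host, ip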

My question is then: why do I have several forwarders/hosts showing up in this search? Given that my search is reasonable, is there any way to "activate" the logging of metrics data on these hosts?

0 Karma

bamthauer
Explorer

I have the same issue: some universal forwarders don't send metrics.log, although the file is up to date on the host, and splunkd.log is sent properly.

Additionally, I found several forwarders not sending all data from the configured sources of TA-nix. A restart solves the problem, but then, without any apparent reason (no high system load), data from one source stops being sent again, although other sources are still sent.
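When a forwarder silently stops sending a single source, the first thing I now check is whether any of its queues are blocked. This is just a sketch of the search I run on the indexer, and of course it only helps for forwarders that do ship their metrics.log:

index=_internal source=*metrics.log group=queue blocked=true
| stats count as blocked_events latest(_time) as last_blocked by host, name
| sort - blocked_events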

0 Karma

hettervik
Builder

Strange. If I remember correctly, my problem was solved by making some adjustments on the deployment server. Apparently the servers that were not sending metrics data were not configured to do so in the inputs file.
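For reference, the stanza that makes a forwarder ship its own logs (metrics.log among them) to _internal looks roughly like this in inputs.conf on our hosts. Just a sketch; the exact default stanzas differ between versions, so compare with your forwarder's system defaults:

# Monitor Splunk's own log directory (splunkd.log, metrics.log, ...)
# and route it to the _internal index
[monitor://$SPLUNK_HOME/var/log/splunk]
index = _internal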

0 Karma

bamthauer
Explorer

Indeed, I found the reason: it seems that some older forwarder versions do have a different input configuration.

0 Karma

renjith_nair
Legend

Probably those hosts are not sending data any more. The metadata command doesn't depend on the time range, but index=_internal does, so those hosts might have sent data earlier and stopped since. Can you increase your time range to the last 30 days or even "all time" and look for one of the hosts from the above list? Or just run the search below to see when these hosts were last updated.

| metadata type=hosts
| rename firstTime as "First Event" lastTime as "Last Event" recentTime as "Last Update"
| fieldformat "First Event"=strftime('First Event', "%c")
| fieldformat "Last Event"=strftime('Last Event', "%c")
| fieldformat "Last Update"=strftime('Last Update', "%c")
| table host "First Event" "Last Event" "Last Update"

Splunk should send its internal data as long as the connection is established. It's quite interesting if it's forwarding other data but not its internal logs.
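To double-check whether the connection is actually there, you can also look at the forwarder connections on the receiving side. Something like the below should work; a sketch, the hostname/fwdType/version fields being what I remember from the tcpin_connections metrics:

index=_internal source=*metrics.log group=tcpin_connections
| stats latest(_time) as last_seen latest(fwdType) as fwdType latest(version) as version by hostname
| fieldformat last_seen=strftime(last_seen, "%c")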

---
What goes around comes around. If it helps, hit it with Karma 🙂

hettervik
Builder

Thanks! So metadata doesn't depend on the time range; that explains a lot. It indeed looks like the hosts I found are not sending data, as they haven't had an update in ages. Your search worked well for finding that out.

Having cleared that up, is there a way of seeing if a host is sending data, but not metrics data? I know it sounds strange, but that's the task I have at hand.

0 Karma

renjith_nair
Legend

That's close to impossible. However, the search below should give you a hint.

| metadata type=sourcetypes
| stats count(eval(like(sourcetype,"splunkd%"))) AS internal, count(eval(NOT like(sourcetype,"splunkd%"))) AS others

Please mark this as the answer if you are happy with it.

---
What goes around comes around. If it helps, hit it with Karma 🙂

hettervik
Builder

Thanks again! I'll check it out. 🙂

0 Karma