Splunk Dev

For the same type of event logs, why, in uncommon cases, does Splunk not show all of the event log content?

rajer
New Member

I have set up a search that looks through all entries in my nifi-app.log file where the header "Standard FlowFile Attributes" appears. In the actual nifi-app.log file, every occurrence of the "Standard FlowFile Attributes" header is followed by all of its associated lines (approximately 16 lines). In most cases, Splunk likewise returns events in which all 16 of these lines are visible.
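
For context, the search itself is along these lines; the index, host, and source values here are placeholders, not my actual values:

index=main host="someHost" source="*nifi-app.log" "Standard FlowFile Attributes"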

However, occasionally (roughly 5 out of every 100 events returned), an event contains only three of the 16 lines that should be visible. There is no way to see additional lines because, in these uncommon cases, the event simply has no more lines to show. When I go back to verify there wasn't an issue in the nifi-app.log file, I find there wasn't one: for the same event, identified by date and timestamp, that Splunk shows only partially, the corresponding entry in nifi-app.log contains all 16 lines.

I was thinking this could have something to do with event breaking, and potentially with configuring the props.conf file in my /local directory, but I don't believe that is the case, because if it were, none of the events would fully display all 16 lines.
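
For reference, this is the sort of props.conf stanza I had in mind; the sourcetype name and the timestamp regex are placeholders, not my actual configuration:

[nifi:app]
# merge consecutive lines into multi-line events
SHOULD_LINEMERGE = true
# start a new event whenever a line begins with a timestamp
BREAK_ONLY_BEFORE = ^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}
# upper limit on the number of lines merged into one event
MAX_EVENTS = 256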

It seems that, for the exact same type of event, Splunk sometimes displays an event with part of its line content missing entirely.

Could someone let me know why this occurs?

Also, any suggestions on how to prevent this from occurring would be greatly appreciated.

Thanks in advance!


HiroshiSatoh
Champion

Have you compared the source and sourcetype of the events that are not fully displayed with those of the events that are?

How is field extraction done? If events that should share the same field extraction are not displayed consistently, the extraction definition may be faulty.
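
For example, a search along these lines (a sketch; replace the index and search string with yours) would show whether the short events differ in source or sourcetype. The linecount field is a default field that records how many lines each event contains:

index=main "Standard FlowFile Attributes" | stats count by source, sourcetype, linecount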


rajer
New Member

I recently compared the source and sourcetype between the events that display all of the expected lines and the events that do not.

All of the events, whether they display all lines or not, have the same source.

There is no distinction in sourcetype either: the sourcetype of the events that display all of the lines matches the sourcetype of the events that do not.

Field extraction is done by specifying the index (i.e. index="someLocation"), the host (i.e. host="someHost"), and a phrase that is common to these events ("specific phrase"); then I use NOT to isolate the events that do not display all lines.

example search to find all events that do display all lines from the original log sourcetype:
index=main host="some host name" "phrase common to all these types of specific logs" "a line that is only found in the events that display all of the lines"

example search to find all events that do not display all lines from the original log sourcetype:
index=main host="some host name" "phrase common to all these types of specific logs" NOT "a line that is only found in the events that display all of the lines"
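
To see how often the short events occur over time, a variant like the following (same placeholders as above) counts only those events per day:

index=main host="some host name" "phrase common to all these types of specific logs" NOT "a line that is only found in the events that display all of the lines" | timechart span=1d count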

Could you elaborate on what you meant when you said the definition is "bad"? I do not have a lot of experience with Splunk and have only recently started using it.

Thanks


rajer
New Member

I believe a more accurate header question would be the following:

"For the exact same type of event logs, why in uncommon cases, does Splunk not fully record all of the event log content?"
