
I see a list of generic names under "savedsearch_name" on search heads, but where are these searches actually saved?

tkwaller
Builder

Hello

On my search heads, I am able to find searches that are named "search1", "search2" etc:

savedsearch_name
search1
search10
search11
search12
search13
search14
search15
search16
search17
search18

However, the underlying searches change: for example, yesterday search1 ran a different litsearch than search1 runs today.

My question is this:
How can I determine where these searches are actually saved, if they are saved at all? I cannot find any saved searches with these names.

eavent_splunk
Splunk Employee

Introspection data can help here: search for the sid (search ID) in the _introspection index to get more information on these searches and where they come from.
Here's a quick and dirty search:

index=_introspection sourcetype=splunk_resource_usage data.process_type=search data.search_props.sid=*<SID>*
| transaction data.search_props.sid
| table data.search_props.*, duration

The provenance and app fields can help you identify where the searches are running.
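As a follow-on sketch (it reuses the same field names as above; the exact provenance format is from my own environment, so treat it as an assumption), you can summarize by provenance and app to see where the generically named searches originate. Dashboard-launched searches typically show a provenance like UI:Dashboard:<view_name>:

index=_introspection sourcetype=splunk_resource_usage data.process_type=search
| stats count BY data.search_props.provenance, data.search_props.app, data.search_props.label
| sort - count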

TheWoodRanger
Explorer

Got a random notification for this thread and realized I know more now than I did then...

You're absolutely correct here. The `provenance` field gathered from _introspection lets you pinpoint where the search was initiated from! Here's a generalized introspection search for everyone:

index=_introspection sourcetype=splunk_resource_usage component=PerProcess data.search_props.sid::*
data.search_props.role="head"
| fields _time, data.*, host
| eval label = if(isnotnull('data.search_props.label'), 'data.search_props.label', "")
| eval provenance = if(isnotnull('data.search_props.provenance'), 'data.search_props.provenance', "unknown")  
| eval read_mb = 'data.read_mb'
| eval written_mb = 'data.written_mb'
| eval process = 'data.process'
| eval pid = 'data.pid'
| eval elapsed = 'data.elapsed'
| eval mem_used = 'data.mem_used'
| eval mem = 'data.mem'
| eval pct_memory = 'data.pct_memory'
| eval pct_cpu = 'data.pct_cpu'
| eval sid = 'data.search_props.sid'
| eval app = 'data.search_props.app'
| eval type = 'data.search_props.type'
| eval mode = 'data.search_props.mode'
| eval user = 'data.search_props.user'
| eval role = 'data.search_props.role'
| fields _time, host, process, pid, elapsed, mem_used, mem, pct_memory, pct_cpu, sid, app, label, type, mode, user, provenance, read_mb, written_mb
| stats latest(*) AS * BY sid
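
If you only care about the generic names from the original question, you can append a filter at the end (a hypothetical addition; the regex just matches the default searchN labels):

| where match(label, "^search\d+$")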

TheWoodRanger
Explorer

After experiencing this problem myself, I discovered the answer. When a dashboard written in SimpleXML invokes a <search> element without an id attribute, Splunk automatically uses one of these default "search5"/"search6"/"search7" string values as the name of the search when logging to _audit. I don't know what determines the actual numbers, but when I added id attributes to a dashboard that didn't have them, the new IDs showed up in the logging in the same place the "search5"-type values did.

I don't have a good answer for the problem this creates, namely that you can't differentiate searches that all fall under these default values. It's rough to consider forcing everyone who builds dashboards to assign id values to all of their searches.
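For illustration, here's a minimal SimpleXML sketch (the query and id are made up) showing both cases; the first search logs under an auto-generated "searchN" name, while the second logs under its id:

<dashboard>
  <label>Example Dashboard</label>
  <row>
    <panel>
      <table>
        <!-- No id attribute: _audit logs this under a generic name like "search5" -->
        <search>
          <query>index=_internal | stats count BY sourcetype</query>
          <earliest>-15m</earliest>
          <latest>now</latest>
        </search>
      </table>
    </panel>
    <panel>
      <table>
        <!-- Explicit id: this name appears in the logging instead -->
        <search id="internal_count_by_sourcetype">
          <query>index=_internal | stats count BY sourcetype</query>
          <earliest>-15m</earliest>
          <latest>now</latest>
        </search>
      </table>
    </panel>
  </row>
</dashboard>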

Splunk!!! Give us a way to break out these searches more effectively; at minimum, include the dashboard view they're run from so we have a way of isolating them! Answer/guidance needed.

woodcock
Esteemed Legend

Maybe your answer should be converted to a comment so that this question shows as unanswered and maybe gets more notice.
