Splunk Search

Excluding a field name from fields command exclusions

DUThibault
Contributor

The fields - command expects a list of field names to exclude, and one can use wildcards in that list. But what if I need to exclude a long list of fields that match a wildcard expression, except for one? Here's an example:

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * | fields - Authentication.user api* autojoin buckets date_* enable_lookups extra_fields file_modify_time max* savedsearch_name search* splunk_server_group timestamp ttl vendor_status + date_zone | fieldsummary

This excludes a bunch of fields named date_hour, date_mday, date_minute and so forth, but what if I want to keep date_zone? One (clumsy) solution I've found is to copy the field to another name before running the exclusions and then copy it back:

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * | eval datezone=date_zone | fields - Authentication.user api* autojoin buckets date_* enable_lookups extra_fields file_modify_time max* savedsearch_name search* splunk_server_group timestamp ttl vendor_status | eval date_zone = datezone | fields - datezone | fieldsummary

Is there a better way?


woodcock
Esteemed Legend

Like this:

... | rename foo1234 AS _foo1234
| fields - foo*
| rename _foo1234 AS foo1234

Fields that start with _ are not included in wildcard matches except for _*.
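
Applied to the search in the question, that would look roughly like this (a sketch; only date_zone is shielded, the rest of the exclusion list is unchanged):

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * | rename date_zone AS _date_zone | fields - Authentication.user api* autojoin buckets date_* enable_lookups extra_fields max* savedsearch_name search* splunk_server_group timestamp ttl vendor_status file_modify_time | rename _date_zone AS date_zone | fieldsummary

Because _date_zone starts with an underscore, the date_* wildcard no longer touches it.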

DUThibault
Contributor

First mentioned by damien_chillet, this is by far the simplest approach. It would also be amenable to being made into a macro if I were so inclined.
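
For instance (a sketch with hypothetical macro names), a pair of one-argument macros in macros.conf could wrap the two rename steps:

# macros.conf -- hypothetical stanza names
[shield_field(1)]
args = f
definition = rename $f$ AS _$f$

[unshield_field(1)]
args = f
definition = rename _$f$ AS $f$

which would let me write ... | `shield_field(date_zone)` | fields - date_* ... | `unshield_field(date_zone)` | ... instead of spelling out the renames.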


woodcock
Esteemed Legend

I didn't notice that (I don't generally read the other comments, just the answers, when I am posting my answer), but there is big value in the _* portion that he did not mention, too. I upvoted his comment and you should do the same.
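
A quick way to see both behaviours on dummy data (a sketch using makeresults and throwaway field names):

| makeresults | eval foo1234="kept", foo5678="dropped" | rename foo1234 AS _foo1234 | fields - foo* | rename _foo1234 AS foo1234 | table foo*

Only foo1234 survives; change fields - foo* to fields - _* foo* and the shielded copy is removed as well.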


somesoni2
SplunkTrust

The name and number of the default date_* fields are constant, so one option would be to create a macro, say date_fields_to_filter, with the definition date_hour date_mday date_minute date_month date_second date_wday date_year. Your query would then look like this:

 | datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * | fields - Authentication.user api* autojoin buckets `date_fields_to_filter` enable_lookups extra_fields file_modify_time max* savedsearch_name search* splunk_server_group timestamp ttl vendor_status | fieldsummary
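
The macro itself is just static text; its macros.conf stanza would look something like this (a sketch):

# macros.conf
[date_fields_to_filter]
definition = date_hour date_mday date_minute date_month date_second date_wday date_year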

DUThibault
Contributor

The macro solution is a good one, simpler than inputlookup. By the same token, it may be easier to do:

| fields + `date_fields_to_keep`

(Although I do realize that the two solutions are not equivalent when the events don't have a uniform field signature)
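
For instance (a sketch; the field list here is only illustrative), date_fields_to_keep could simply enumerate everything I actually want fieldsummary to report on:

# macros.conf -- hypothetical keep-list
[date_fields_to_keep]
definition = Authentication.action Authentication.app Authentication.dest Authentication.src Authentication.src_user Authentication.tag date_zone eventtype host index info linecount source sourcetype splunk_server tag::action tag::eventtype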


ddrillic
Ultra Champion

Maybe something in the spirit of @somesoni2, with a lookup, as in Search all fields except ____


DUThibault
Contributor

I tried:

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * [|inputlookup myfieldstolookup.csv | eval query="searchTerm=".fieldName | table query | format ] | fieldsummary

where I used the Lookup Editor app to create myfieldstolookup.csv, but I'm not sure what the contents of that file should be. I tried a file that looks like:

fieldName,fieldNameOut
Authentication.action,Authentication.action
...
tag::eventtype,tag::eventtype

Far from filtering down to the 17 fields I want, it added 21 fields (addr, auid, dev, ...) to the unfiltered 49-field search. The fragment:

[|inputlookup myfieldstolookup.csv | eval query="searchTerm=".fieldName | table query | format ]

returns

( ("searchTerm=Authentication.action") OR ... )

instead of the expected

( (searchTerm=Authentication.action) OR ... )

What am I doing wrong?


somesoni2
SplunkTrust

Your lookup should have only a single column, say fieldName, with all the field names that you want to keep.

fieldName
Authentication.action
...
tag::eventtype

Now your search should look like this (note the rename and the parameters to the format command; that's the trick: passing six empty double-quoted strings separated by spaces):

 | datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * [|inputlookup myfieldstolookup.csv | table fieldName | rename fieldName as query | format "" "" "" "" "" "" ] | fieldsummary
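
The six empty strings blank out format's default decorations (the surrounding parentheses and the AND/OR separators), and renaming the column to query is what makes format return the bare names instead of fieldName=value pairs. To see the effect without a lookup, you can try something like this on dummy data (a quick sketch):

| makeresults count=3 | streamstats count | eval fieldName=case(count=1,"date_zone", count=2,"host", count=3,"tag::eventtype") | table fieldName | rename fieldName as query | format "" "" "" "" "" ""

This returns a single search field containing date_zone host "tag::eventtype", ready to be consumed by the outer table command.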

DUThibault
Contributor

http://docs.splunk.com/Documentation/Splunk/7.0.0/Knowledge/Usefieldlookupstoaddinformationtoyoureve... says "The table in the CSV file should have at least two columns", which is what initially led me to put two columns in there.


DUThibault
Contributor

It gets closer, but the fragment:

| inputlookup myfieldstolookup.csv | table fieldName | rename fieldName as query | format "" "" "" "" "" ""

now yields:

"Authentication.action" "Authentication.app" "Authentication.dest" "Authentication.src" "Authentication.src_user" "Authentication.tag" date_zone eventtype host index info linecount source sourcetype splunk_server "tag::action" "tag::eventtype"

which again fails to filter the fields. Looks like the quotes get added when "punctuation" (i.e. colons and periods) is present in the field names.


DalJeanis
SplunkTrust

Try this

 [
 | inputlookup myfieldstolookup.csv 
 | table fieldName 
 | rename fieldName as query 
 | format "" "" "" "" "" "" 
 | rex mode=sed field=search "s/\"//g"
 ]

DUThibault
Contributor

That does get rid of the quotes in the fragment, but it turns out they're not the problem: the problem is very simply the '*' before the fragment, which voids the filtering being attempted. All that was needed was to change:

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table * [ | inputlookup myfieldstolookup.csv | table fieldName | rename fieldName as query | format "" "" "" "" "" "" ] | fieldsummary

into:

| datamodel Authentication Successful_Authentication search | search sourcetype=audittrail | table [ | inputlookup myfieldstolookup.csv | table fieldName | rename fieldName as query | format "" "" "" "" "" "" ] | fieldsummary

damien_chillet
Builder

I think the way you did it is already very smart 🙂

Alternatively, you could use rename instead of eval!

DUThibault
Contributor

Indeed, rename is one step better, allowing one to skip the second fields - call:

| from datamodel:Authentication.Successful_Authentication | search sourcetype=audittrail | table * | rename date_zone as datezone | fields - api* autojoin buckets date_* enable_lookups extra_fields file_modify_time max* savedsearch_name search* splunk_server_group timestamp ttl user vendor_status | rename datezone as date_zone | fieldsummary

However, I was hoping there was a way to use some sort of regex to replace the wildcard, that is to say, a way of specifying date_[^z].* instead of date_*.
