Splunk Search

How to run a subsearch within a specified time limit?

santhosh2kece
Engager

Hi,

I am running the search query below and am getting the error "[subsearch]: Subsearches of a real-time search run over all-time unless explicit time bounds are specified within the subsearch"

index="webproxylogs" [|inputlookup Blacklist_URLs.csv | rename Malicious_URL as cs_host | fields + cs_host |dedup cs_host |fields cs_host] NOT [|inputlookup Whitelist_URLs.csv | rename Non-Malicious_URL as cs_host | fields + cs_host |dedup cs_host |fields cs_host]

In my search, I am trying to get the list of internal hosts accessing the domains listed in Blacklist_URLs.csv, while excluding the whitelisted domains (like Google, Yahoo, etc.) listed in Whitelist_URLs.csv. If well-known domains like google.com are accidentally added to Blacklist_URLs.csv, they get excluded by Whitelist_URLs.csv.

The search above was giving me results, but in the recent past (the last 45 days) it has been giving me the subsearch time limit error mentioned above. Please help me rectify this issue. Thanks


MuS
SplunkTrust

Hi santhosh2kece,

That's only one of the limitations you can run into when using subsearches.
I would simply set up an automatic lookup http://docs.splunk.com/Documentation/Splunk/6.2.0/Knowledge/Usefieldlookupstoaddinformationtoyoureve... that sets a new field called Malicious_URL to yes for the bad URLs and Malicious_URL=no for the whitelisted ones.
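
To illustrate, here is a rough sketch of what that could look like in transforms.conf and props.conf; the lookup name url_reputation, the file url_reputation.csv, the columns url and Malicious_URL, and the sourcetype bcoat_proxysg are only placeholders, so adjust them to whatever your lookup and proxy sourcetype are actually called:

# transforms.conf
[url_reputation]
filename = url_reputation.csv

# props.conf, applied to the proxy sourcetype
[bcoat_proxysg]
LOOKUP-url_reputation = url_reputation url AS cs_host OUTPUTNEW Malicious_URL

Here url_reputation.csv would hold a url column containing the entries of both lists and a Malicious_URL column set to yes for the blacklisted and no for the whitelisted URLs.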

This way you can quite easily search for this:

index="webproxylogs" Malicious_URL="yes"

and all is good, no more problems related to any subsearch limits.

hope this helps ...

cheers, MuS


santhosh2kece
Engager

MuS,

I added the blacklisted URLs CSV as an automatic lookup with "Malicious_URL AS 1 Non_Malicious_Url AS 0 OUTPUTNEW". However, when I run the search query

index="webproxylogs" Malicious_URL="yes"
I receive the following error: Error 'Could not find all of the specified lookup fields in the lookup table.' for conf 'source::tcp:9998|host::nyc-proxy-2.bfm.com|bcoat_proxysg' and lookup table 'FSISAC_Malicious'.

Also, in my query I used two CSV files: 1. Blacklist_URLs.csv and 2. Whitelist_URLs.csv. Please let me know whether Whitelist_URLs.csv should be ignored.


MuS
SplunkTrust

Regarding the error: you provided either too few or too many fields for the lookup.
Since I don't know your exact use case, I cannot answer this for you. Check whether using the blacklist alone is sufficient for you; if not, set up a second automatic lookup using the whitelist as well.
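
If you are not sure which columns the lookup table really contains, you can check them straight from the search bar and test the field mapping with an explicit lookup before relying on the automatic one. FSISAC_Malicious and cs_host below are simply taken from your error message and your original search, and Malicious_URL is assumed to be the column that holds the bad domains:

| inputlookup FSISAC_Malicious | head 5

index="webproxylogs" | lookup FSISAC_Malicious Malicious_URL AS cs_host OUTPUTNEW Malicious_URL | search Malicious_URL=*

The first search shows the actual column names; the second should only return proxy events whose cs_host matches an entry in the lookup. Once the field names configured in the automatic lookup match those columns exactly, the error should go away.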
