
Ghost forwarder logs getting indexed

gibu_george_med
New Member

Hi All,

I'm using a trial license of Hunk, which has a daily indexing limit of 500 MB. I have noticed that this limit is being used up entirely by the source /opt/splunkforwarder/var/log/splunk/splunkd_stdout.log, which goes to the index "main". This log is from the same machine that is running Hunk.

The strange thing is that /opt/splunkforwarder/* does not exist on this machine, nor do I see this log file under "Files & directories" in Data inputs. So where is this log file coming from, and how do I stop it from getting indexed?

--gibu


gibu_george_med
New Member

Hi,

Any updates?
Is it possible to reinstall Splunk on the same machine, or to try it out on another machine? Would this be a violation of the Splunk trial license?


gibu_george_med
New Member

(screenshot attached)


gibu_george_med
New Member

This does not display any results:

 index=* splunkd_stdout.log | stats count by host,source 

dmaislin_splunk
Splunk Employee

Run these searches and show me the output:

Should display results:

index=_* splunkd_stdout.log | stats count by host,source

Probably will not display results:

index=* splunkd_stdout.log | stats count by host,source

gibu_george_med
New Member

There is also a file at

-rw------- 1 splunk splunk 0 Aug 29 18:09 /opt/splunk/var/log/splunk/splunkd_stdout.log

but the last entry here is from August 29th.


gibu_george_med
New Member

Running the command gives me this:

Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/1473171734-1473107847-4310767278958693587.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/1472934796-1472541788-4295240289056761388.tsidx matches
    7   source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log    1562157   1472558689  1473165985  1473165985  
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/1473127638-1472915075-4307875633388093163.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/1473185832-1473161845-4311692349811955989.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/merged_lexicon.lex matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1473185832_1472541788_17/1473181574-1473152005-4311412142725860366.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473924786-1473924786-4361414439449264720.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473924726-1473924726-4361410507707283105.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473373407-1473166053-4325281208203107757.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473773351-1473373467-4351491069024689492.tsidx matches
    1   source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log    1940339   1473166053  1473924726  1473924726  
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473924666-1473923825-4361407434598118510.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473923765-1473919506-4361348448651943192.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473909426-1473773411-4360408773442212092.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_18/1473919446-1473909486-4361065335378366256.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472513715_1472033683_13/1472513319-1472488530-4267615600398307715.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472513715_1472033683_13/1472513715-1472493327-4267658847200076533.tsidx matches
    24  source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log    2184  1472483840  1472493860  1472493860  
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472513715_1472033683_13/1472508328-1472033683-4267664013340391307.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472513715_1472033683_13/merged_lexicon.lex matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472577080_1472349341_14/1472569643-1472493921-4271308427567352661.tsidx matches
    4   source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log   99881  1472494179  1472557273  1472557273  
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472577080_1472349341_14/1472576446-1472555082-4271752395862913981.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472577080_1472349341_14/1472574880-1472349341-4271888964310670199.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472577080_1472349341_14/merged_lexicon.lex matches
    Binary file /opt/splunk/var/lib/splunk/defaultdb/db/db_1472577080_1472349341_14/1472577080-1472556641-4271888888392539162.tsidx matches
    25  source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log  140 1472474874  1472493355  1472493531  
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_26/1473920406-1473598032-4361128199057228690.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_26/1473925029-1473925029-4361431280961092190.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_26/1473925026-1473920417-4361431282097119425.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_26/rawdata/218848661 matches
    1   source::/opt/splunk/var/log/splunk/splunkd_stdout.log    27 1471955304  1471955307  1471955307  
    1   source::/opt/splunk/var/log/splunk/splunkd_stdout.log    27 1472101479  1472101483  1472101483  
    19  source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log   42 1472023362  1472023397  1472028234  
    26  source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log   84 1472553563  1472555442  1472555443  
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_25/1473597380-1473587159-4339957824221576398.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_25/1473546168-1473382646-4336602485672984195.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_25/1473587150-1473546171-4339287790956830217.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_25/rawdata/353076297 matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/hot_v1_25/1473598031-1473597411-4340000353425231498.tsidx matches
    Binary file /opt/splunk/var/lib/splunk/_internaldb/db/db_1472989292_1472557300_23/merged_lexicon.lex matches
    19  source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log   56 1472068852  1472068948  1472068951

I also get lots of lines like the following:

09-14-2016 05:39:14.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=10.108148, eps=7.451544, kb=313.355469, ev=231, avg_age=0.000000, max_age=0
09-14-2016 05:40:16.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=10.108923, eps=7.451535, kb=313.379883, ev=231, avg_age=0.000000, max_age=0
09-14-2016 05:41:18.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=10.124065, eps=7.516113, kb=313.846680, ev=233, avg_age=0.000000, max_age=0
09-14-2016 05:42:20.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=10.261694, eps=7.580766, kb=318.107422, ev=235, avg_age=0.000000, max_age=0

09-06-2016 00:32:53.846 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=194149 poolsz=524288000
09-06-2016 00:33:54.846 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=194149 poolsz=524288000
09-06-2016 00:34:54.847 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=194149 poolsz=524288000

09-08-2016 02:41:26.850 +0000 WARN  DateParserVerbose - A possible timestamp match (Wed Dec 20 17:35:10 2000) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log|host::splunk_test_machine|splunkd_access|\n               7 similar messages suppressed.  First occurred at: Thu Sep  8 02:35:26 2016
09-08-2016 02:42:26.848 +0000 WARN  DateParserVerbose - A possible timestamp match (Wed Dec 20 17:35:10 2000) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log|host::splunk_test_machine|splunkd_access|
09-08-2016 02:43:26.845 +0000 WARN  DateParserVerbose - A possible timestamp match (Wed Dec 20 17:35:10 2000) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log|host::splunk_test_machine|splunkd_access|
09-08-2016 02:44:26.842 +0000 WARN  DateParserVerbose - A possible timestamp match (Wed Dec 20 17:35:10 2000) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source::/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log|host::splunk_test_machine|splunkd_access|

09-13-2016 12:50:11.470 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=17.485462, eps=12.161542, kb=542.038086, ev=377, avg_age=0.000000, max_age=0
09-13-2016 12:51:13.470 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=17.486665, eps=12.161437, kb=542.080078, ev=377, avg_age=0.000000, max_age=0
09-13-2016 12:52:15.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=17.486335, eps=12.160945, kb=542.091797, ev=377, avg_age=0.000000, max_age=0
09-13-2016 12:53:17.471 +0000 INFO  Metrics - group=per_source_thruput, series="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log", kbps=17.491631, eps=12.161166, kb=542.246094, ev=377, avg_age=0.000000, max_age=0

09-13-2016 12:54:17.469 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=555260 poolsz=524288000
09-13-2016 12:55:17.470 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=552587 poolsz=524288000
09-13-2016 12:56:18.470 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=552595 poolsz=524288000
09-13-2016 12:57:19.470 +0000 INFO  LicenseUsage - type=Usage s="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" st=splunkd_access h="splunk_test_machine" o="" idx="main" i="0B3D9939-35F1-4D7A-878F-346B4CAA010A" pool="auto_generated_pool_download-trial" b=555652 poolsz=524288000
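Incidentally, LicenseUsage events like the ones above can be summed per source to see exactly what is eating the daily quota. A sketch (field names as they appear in the log lines: s is the source, st the sourcetype, idx the index, b the bytes counted against the license):

```
index=_internal source=*license_usage.log type=Usage
| stats sum(b) as bytes by s, st, idx
| eval MB=round(bytes/1024/1024,2)
| sort - MB
```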

Is it possible to reinstall Splunk on the same machine, or to try it out on another machine? Would this be a violation of the Splunk trial license?


dmaislin_splunk
Splunk Employee

What happens if you run the following command?

find / -type f -exec grep -i 'splunkd_stdout.log' {} \;
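A variant of the idea above that prints only the matching file paths (grep -l) can make the output easier to read. Here is a minimal, self-contained sketch demonstrated on a throwaway directory so it is safe to run anywhere; the inputs.conf content is hypothetical:

```shell
# Create a throwaway directory tree with one config file that references
# the rogue source and one that does not.
tmp=$(mktemp -d)
mkdir -p "$tmp/etc/system/local"
# Hypothetical inputs.conf monitoring the forwarder's stdout log
printf '[monitor:///opt/splunkforwarder/var/log/splunk/splunkd_stdout.log]\nindex = main\n' \
  > "$tmp/etc/system/local/inputs.conf"
printf '# unrelated stanza\n' > "$tmp/etc/system/local/outputs.conf"
# -r recurses, -l prints only the names of matching files (not every line)
found=$(grep -rl 'splunkd_stdout.log' "$tmp")
echo "$found"
rm -rf "$tmp"
```

On a real system you would point grep -rl at $SPLUNK_HOME/etc rather than a temp directory, which avoids the index buckets (*.tsidx, rawdata) that dominated the output pasted above.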

Raghav2384
Motivator

Hello @gibu_george_medlife,

First, the monitor/batch stanzas for these logs can reside in either default or local directories: $SPLUNK_HOME/etc/system/default|local, or $SPLUNK_HOME/etc/apps/.../local.
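For illustration, a monitor stanza picking up this source could look like the following (a hypothetical example; the file location and app are assumptions):

```
# e.g. $SPLUNK_HOME/etc/system/local/inputs.conf, or an app's local/ directory
[monitor:///opt/splunkforwarder/var/log/splunk/splunkd_stdout.log]
index = main
disabled = false
```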

When you say it's the same machine, how did you come to that conclusion? By looking at the host name? For example:

does index=main source="/opt/splunkforwarder/var/log/splunk/splunkd_stdout.log" give a host name that is the same as the Hunk host? If yes, one possible reason I can think of is that someone copied the stanzas as-is. For example, if I have one server called Server1 with inputs and outputs configured, and I distribute those configs as-is to Server2, then unless someone goes into the inputs and changes the host name to $decideOnStartup or manually enters Server2, the logs will be displayed with host=Server1. If this is the case, try to find the IPs of all source hosts to determine where/which server is sending these logs. Use this search on your Hunk server to get the IPs:

index="_internal" source="*metrics.log" group=tcpin_connections  | 
eval sourceHost=if(isnull(hostname), sourceHost,hostname) | 
eval connectionType=case(fwdType=="uf","Universal Forwarder", fwdType=="lwf", "Light Weight Forwarder",fwdType=="full", "Splunk Indexer", connectionType=="cooked" or connectionType=="cookedSSL","Splunk Forwarder", connectionType=="raw" or connectionType=="rawSSL","Legacy Forwarder") | 
eval build=if(isnull(build),"n/a",build) | 
eval version=if(isnull(version),"pre 4.2",version) | 
eval guid=if(isnull(guid),sourceHost,guid) | 
eval os=if(isnull(os),"n/a",os)| 
eval arch=if(isnull(arch),"n/a",arch) | 
eval my_splunk_server = splunk_server | 
fields connectionType sourceIp sourceHost sourcePort destPort kb tcp_eps tcp_Kprocessed tcp_KBps my_splunk_server build version os arch | 
eval lastReceived = if(kb>0, _time, null) | 
stats first(sourceIp) as sourceIp first(connectionType) as connectionType first(sourcePort) as sourcePort first(build) as build first(version) as version first(os) as os first(arch) as arch max(_time) as lastConnected max(lastReceived) as lastReceived sum(kb) as kb avg(tcp_eps) as avg_eps by sourceHost | 
stats first(sourceIp) as sourceIp first(connectionType) as connectionType first(sourcePort) as sourcePort first(build) as build first(version) as version first(os) as os first(arch) as arch max(lastConnected) as lastConnected max(lastReceived) as lastReceived first(kb) as KB first(avg_eps) as eps by sourceHost | 
eval status = if(isnull(KB) or lastConnected<(info_max_time-900),"missing",if(lastConnected>(lastReceived+300) or KB==0,"quiet","active")) | sort sourceHost

Hope this helps!

Thanks,
Raghav


gibu_george_med
New Member

Sadly, that did not help... I'm still seeing the splunkforwarder logs getting indexed.
I have blocked ports 8089 and 9997 at the firewall so that there are no incoming connections on those ports to the machine running Splunk, and I also do not see any connections on ports 8089 and 9997 when I run netstat on this machine.

I'm still not able to figure out where this log is coming from.
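One more thing worth checking here (a sketch, not from the thread): a forwarder whose install directory was deleted can keep running from the deleted binary, so /opt/splunkforwarder being absent does not prove nothing is tailing the log.

```shell
# Look for any splunkd process still running. The [s]plunkd bracket trick
# keeps grep from matching its own command line.
out=$(ps -ef | grep -i '[s]plunkd' || echo "no splunkd processes found")
echo "$out"
# Also confirm whether the directory really is gone:
ls -ld /opt/splunkforwarder 2>/dev/null || echo "/opt/splunkforwarder not present"
```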

I'm not able to do a trial of Splunk because every day it exceeds the daily indexing limit and blocks all searches. What is the way forward? Even after I stop this rogue file from getting indexed, how long would I have to wait before I can start searching again?

--gibu


gibu_george_med
New Member

Hi @Raghav2384,

Thanks for the query; with it I was able to find the clients that were still forwarding data, and I have shut down the forwarders on those clients.

Hopefully this will stop more data from getting indexed 🙂

--gibu


Raghav2384
Motivator

Excellent! Please accept it as an answer if it helped you, so that others can find it easily.
