Greetings,
I have 2 servers that suddenly stopped sending data to the indexer, and I am struggling to find the root cause. I can telnet to the indexer from the forwarder just fine.
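Since my outputs.conf below points at SSL certs, I realize a plain telnet success may not prove the TLS handshake itself works. Is something like this (with the indexer's real IP substituted for the redacted one, and assuming the same CA file the forwarder uses) the right way to verify the handshake end to end?

```shell
# Attempt a TLS handshake against the indexer's receiving port,
# validating against the same CA bundle the forwarder is configured with.
# 10.x.x.x is redacted here, as in my configs.
openssl s_client -connect 10.x.x.x:9997 \
    -CAfile /opt/splunk/etc/auth/cacert.pem
# A working handshake should end with a certificate chain and
# "Verify return code: 0 (ok)"; a failure should show the alert/error.
```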
Here is outputs.conf:
[tcpout]
defaultGroup = default
disabled = false
[tcpout:default]
compressed = true
server = 10.x.x.x:9997
sslCertPath = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = $1$wUgcTqWznVA=
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
sslVerifyServerCert = false
Here is inputs.conf:
[default]
host = xxxx
[SSL]
password = $1$PK3DT9mO4713
serverCert = /opt/splunk/etc/auth/server.pem
rootCA = /opt/splunk/etc/auth/cacert.pem
I currently have SSL turned off in server.conf:
[general]
guid = xxxxx
serverName = xxxxx
[lmpool:auto_generated_pool_download-trial]
description = auto_generated_pool_download-trial
quota = MAX
slaves = *
stack_id = download-trial
[lmpool:auto_generated_pool_forwarder]
description = auto_generated_pool_forwarder
quota = MAX
slaves = *
stack_id = forwarder
[lmpool:auto_generated_pool_free]
description = auto_generated_pool_free
quota = MAX
slaves = *
stack_id = free
[lmpool:auto_generated_pool_enterprise]
description = auto_generated_pool_enterprise
quota = MAX
slaves = *
stack_id = enterprise
[license]
active_group = Enterprise
[sslConfig]
enableSplunkdSSL = false
sslKeysfilePassword = $1$eOiFDozCt+53
Other notes:
The strange thing is that I mirrored this configuration from other servers that are forwarding traffic just fine, yet these 2 will not send anything, and the logs are not full of errors.
I took over Splunk just recently, so I am still very new to all of this.
Starting Splunk in debug mode, I noticed the following lines, which look odd:
05-31-2012 18:06:57.353 DEBUG TcpOutputProc - Cannot find any valid descriptors when looking for new indexer.
05-31-2012 18:06:57.353 DEBUG TcpOutputProc - Looking for indexer...
05-31-2012 18:06:57.353 DEBUG TcpOutputProc - Connection not available. Waiting for connection ...
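For what it's worth, this is roughly how I have been pulling output- and SSL-related messages out of splunkd.log on the forwarder (the component names in the pattern are my guess at what to search for):

```shell
# Show recent warnings/errors from the TCP output and SSL components
# in the forwarder's splunkd.log. Assumes SPLUNK_HOME is set
# (e.g. /opt/splunk on these hosts).
grep -E "TcpOutputProc|SSL" "$SPLUNK_HOME/var/log/splunk/splunkd.log" \
    | grep -E "WARN|ERROR" \
    | tail -n 50
```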
Does anyone have any insight?