
Second instance of a heavy forwarder on the same system (UF not able to connect)

Elsurion
Communicator

Hi all

I have a working heavy forwarder on a system, and now I want a second heavy forwarder instance on the same host. I'd like to test some limiting features on the actual data stream. I can't move this test anywhere else, because I need a data throughput that none of our non-productive environments can provide, and rather than set up a new system I'd like to make use of hardware that's already installed.

The setup of the second heavy forwarder went well, including the binding of the second input port as well as the other ports of the second instance.
The port is open, but the Universal Forwarder isn't able to establish a stable connection to the second heavy forwarder.
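
For reference, the second instance runs on its own set of ports so it doesn't clash with the first one, roughly like this (a sketch: the 9996 input port is from my real setup, the management and web ports are just placeholder values):

# web.conf of the second instance: management and web ports moved off the defaults
[settings]
mgmtHostPort = 127.0.0.1:8090
httpport = 8001

# inputs.conf of the second instance: the receiving port the forwarder targets
[splunktcp://9996]
disabled = false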

This is the outputs.conf:

splunk@mysystem:default $ cat outputs.conf
[tcpout]
# do not index locally
indexAndForward = false
# forward all local indexes
forwardedindex.filter.disable = true
useACK = true
defaultGroup = splunk

[tcpout:splunk]
server = splunk-indexer01:9997

[tcpout:splunk-hfw]
server = splunk-hfw:9997

[tcpout:splunk-sink]
server = splunk-hfw:9996
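
A side note on this outputs.conf: defaultGroup only names splunk, so events only reach splunk-hfw or splunk-sink when an input routes to them explicitly, for example per input stanza (the monitored path below is just an illustration, not one of my real inputs):

# inputs.conf on the sending side: route one input to the test group
[monitor:///var/log/example.log]
_TCP_ROUTING = splunk-sink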

A manual connection test to the new port works as well:

splunk@mysystem:default $  telnet splunk-hfw 9996
Trying <someip>...
Connected to splunk-hfw.
Escape character is '^]'.
^C
splunk@mysystem:default $
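
Telnet only proves that the TCP port is reachable, though, not which process owns it. To rule out a mix-up between the two instances, the listener can be double-checked with something like:

ss -tlnp | grep 9996    # shows which splunkd process is bound to :9996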

In the splunkd.log on either system I don't see anything that points me to the cause of the problem.
On the UF I'm getting timeouts:

02-19-2018 13:42:00.425 +0100 WARN  TcpOutputProc - Cooked connection to ip=<hfwip>:9996 timed out
02-19-2018 13:42:00.425 +0100 WARN  TcpOutputProc - Cooked connection to ip=<hfwip>:9996 timed out
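
One pattern that would produce exactly this combination (telnet connects, but cooked connections time out) is a receiving stanza of the wrong type: with useACK = true the UF waits for acknowledgements that only a splunktcp input sends. This is just an assumption to double-check, not something I can read from the logs:

# on the receiving instance, these two stanza types behave very differently:
[splunktcp://9996]   # speaks S2S and sends ACKs -- what a forwarder needs
# [tcp://9996]       # raw TCP: accepts the connection (telnet succeeds),
#                    # but never ACKs, so a UF with useACK=true times out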

On the heavy forwarder I'm getting broken pipes, presumably the receiving end of the same dropped connections, but I don't know what causes them:

02-19-2018 13:42:16.849 +0100 ERROR TcpInputProc - Error encountered for connection from src=<ufw>:51891. Broken pipe

Traffic on the normal port :9997 works without any issue.
Does anyone have a hint about what the problem might be?
