Getting Data In

Can the heavy-forwarder send to multiple receivers?

ufotech
Explorer

The 6.2.0 heavy forwarder is configured to send everything to the indexer xyz. Now I want it to also send specific events to a local TCP port (localhost) in raw format. These events are identified by a regex, e.g. "relevant-Message".
The 'relevant-message' event should be duplicated, i.e. sent both to the indexer and to the local TCP port.

Can this be done?
Can you please give a config example for props, inputs and transforms on the heavy-forwarder?

inputs.conf

[monitor:///var/log/.../logsink*.log]
disabled = false
index = otex
sourcetype = otex

[monitor:///var/log/.../audit*.log]
disabled = false
index = otex_audit
sourcetype = otex_audit

[monitor:///var/log/.../*.log]
disabled = false
index = main
blacklist = /(logsink|audit).*
sourcetype = main

props.conf

[default]
TRUNCATE = 250000

[main]
LINE_BREAKER = (\\n+|[\r\n])
BREAK_ONLY_BEFORE_DATE = false
SHOULD_LINEMERGE = false


[otex]
SEGMENTATION = full
SEGMENTATION-all = full
TRANSFORMS-otex-fields-1 = otex_fields_1
#split at "\n", \r or \n
LINE_BREAKER = (\\n+|[\r\n])
#merge lines that do not start with a Date
SHOULD_LINEMERGE = true
MAX_EVENTS = 1024
BREAK_ONLY_BEFORE_DATE = true
CHECK_METHOD = modtime
#date parsing
MAX_TIMESTAMP_LOOKAHEAD = 50
TIME_PREFIX = \[
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%Q

outputs.conf

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = ch123.net:9960

[tcpout-server://ch123.net:9960]

transforms.conf

[otex_fields_1]
REGEX = ^\[.*?\] \[(.*?)\] \[(.*?)\] \[(.*?)\] \[(.*?)\] \[(.*?)\] \[(.*?)\] \[(.*?)\] 
FORMAT = tid::"$1" hostname::"$2" component::"$3" instance::"$4" thread::"$5" level::"$6" logger::"$7" 
WRITE_META = true

Thanks!

1 Solution

ufotech
Explorer

Cloning sample:
This sends every event to two different Splunk indexers on the same box (server1), listening on different ports.

outputs.conf

[tcpout]
defaultGroup = target1, target2

[tcpout:target1]
disabled = 0
server = server1.net:9960

[tcpout:target2]
disabled = 0
server = server1.net:29961

If instead you want to send specific inputs to a specific target only:

inputs.conf

[monitor://path/file1.log]
_TCP_ROUTING = target1

[monitor://path/file2.log]
_TCP_ROUTING = target2
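
For the regex-based part of the original question, a props/transforms pair on the heavy forwarder can also set the routing per event rather than per input. The sketch below is untested; it reuses the group names target1/target2 from the outputs.conf above, assumes the matching events arrive with sourcetype otex (as in the question's config), and the stanza name route_relevant is made up for illustration. For selective routing, defaultGroup would be just target1, so only matching events reach target2.

props.conf

[otex]
# attach the routing transform to the sourcetype that carries the
# "relevant-message" events (assumed to be otex, per the question)
TRANSFORMS-routing = route_relevant

transforms.conf

[route_relevant]
# events matching this string get an explicit routing list;
# listing both groups keeps the copy that goes to the indexer
REGEX = relevant-message
DEST_KEY = _TCP_ROUTING
FORMAT = target1,target2

Events that do not match keep the default routing (defaultGroup), so they go to target1 only.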


aanataliya
Explorer

@ufotech: were you able to solve this problem? Can you please share your solution so it can benefit others?


ufotech
Explorer

That is:

  • send events containing the string "relevant-message" to localhost:12345 in raw format
  • forward all events (including the above!) to ch123.net:9960 (as in the configs above)
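
One possible outputs.conf for this, treated as an untested sketch rather than a verified config: it keeps the existing default-autolb-group to ch123.net:9960 as the default, and adds a second group for the local listener. The group name local_raw is made up here; sendCookedData = false is the setting that makes that group emit raw (uncooked) data.

outputs.conf

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = ch123.net:9960

[tcpout:local_raw]
# hypothetical group for the local listener on port 12345
server = localhost:12345
# send events as raw text instead of Splunk's cooked format
sendCookedData = false

A routing transform that matches "relevant-message" (as sketched above under the cloning sample) would then use FORMAT = default-autolb-group,local_raw, so matching events are cloned to both destinations while everything else goes only to the indexer.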

ufotech
Explorer

Thank you indeed. I know there's a lot of documentation around.
I have been reading it for days and was able to solve most of the issues I had with upgrading a few outdated Splunk instances and moving from syslog-ng to heavy forwarders.

What I am asking for are the concrete statements in the respective config files, as I'm having difficulties figuring them out.

Thanks


somesoni2
Revered Legend

You would need to configure data routing so that specific data can be sent to specific indexers. Read through the documentation page below.

http://docs.splunk.com/Documentation/Splunk/6.2.0/Forwarding/Routeandfilterdatad

vasanthmss
Motivator

Read these: the data routing docs and the outputs.conf documentation.

V