Getting Data In

Universal forwarder stopped sending data after a reinstall

vikram_m
Path Finder

We were facing an issue with Splunk log forwarding to our indexer cluster.

I found that our Splunk Enterprise servers are on 6.5.3 while the UFs were on 6.6.2, so I uninstalled the 6.6.2 UF and reinstalled version 6.5.2 on the same machine.
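For reference, the installed version on each box can be confirmed with something like the following (paths assume default Linux installs):

    # on the universal forwarder
    /opt/splunkforwarder/bin/splunk version

    # on the Enterprise / indexer servers
    /opt/splunk/bin/splunk version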

Then I applied the same configuration on the new UF. In the logs I can see the UF is connected to the indexer, but no data is being forwarded to the Enterprise instance.

I feel there is something I missed during the reinstallation.

Thanks.
Vikram.


gcusello
SplunkTrust

Hi vikram_m,
let me check my understanding:

  • your UFs are connected to the Indexers and send their internal logs to them (test it with index=_internal host=your_host | head 1000 using "All time" as the time range),
  • there are monitored logs on the host,
  • the monitored logs don't arrive at the Indexers (test it with index=* host=your_host | head 1000),
  • you correctly configured outputs.conf on the UFs to point them to the indexers (you can reuse the previous one; see the sample below),
  • you have all the required TAs on the UFs (you can reuse the previous ones) or an inputs.conf in $SPLUNK_HOME/etc/system/local,
  • now you don't receive the monitored logs.
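For the outputs.conf point, a minimal file on a UF looks roughly like this (indexer hostnames and port are placeholders for your environment):

    # minimal sketch of outputs.conf on a UF
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = idx1.example.com:9997, idx2.example.com:9997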

Is it correct?

Bye.
Giuseppe


vikram_m
Path Finder

(1) your UFs are connected to the Indexers and send internal logs to them (test it with index=_internal host=your_host | head 1000 using "All time" as the time range): this does not seem to be working.
(2) there are monitored logs on the host: these are not arriving either.
(3) the monitored logs don't arrive at the Indexers (test it with index=* host=your_host | head 1000): this search also returns nothing.
(4) you correctly configured outputs.conf on the UFs to point them to the indexers (you can reuse the previous one): yes, I applied the same configuration as before.
(5) you have all the required TAs on the UFs (you can reuse the previous ones) or an inputs.conf in $SPLUNK_HOME/etc/system/local: we have inputs.conf in the system/local directory and outputs.conf at etc/apps/ssl_indexer_app/local/; this app is pushed from the deployment server we also use for the servers (a sketch of this layout is shown below).
(6) After the reinstall, log reception stopped completely. However, I can see in splunkd.log that the UF is able to connect to the indexers as per the app from the deployment server.
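For reference, a hypothetical sketch of the kind of SSL-enabled outputs.conf such an app typically carries, assuming placeholder indexer names and certificate paths (the actual ssl_indexer_app settings may differ):

    # hypothetical sketch of etc/apps/ssl_indexer_app/local/outputs.conf
    # server names, port, and certificate paths are placeholders, not the real values
    [tcpout]
    defaultGroup = ssl_indexers

    [tcpout:ssl_indexers]
    server = idx1.example.com:9997, idx2.example.com:9997
    sslCertPath = $SPLUNK_HOME/etc/auth/client.pem
    sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
    sslPassword = <certificate password>
    sslVerifyServerCert = false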


gcusello
SplunkTrust

First of all, are you using SSL between the UFs and the Indexers? If yes, check the certificate password!

Let me make sure I understand: in splunkd.log you see the UF correctly connected to the Indexers, but internal logs aren't being sent to them, correct?

Perform one final check before reinstalling: verify that the hostname is correct in $SPLUNK_HOME/etc/system/local/server.conf and $SPLUNK_HOME/etc/system/local/inputs.conf.
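These are the stanzas to look at; your_host is a placeholder:

    # $SPLUNK_HOME/etc/system/local/server.conf
    [general]
    serverName = your_host

    # $SPLUNK_HOME/etc/system/local/inputs.conf
    [default]
    host = your_host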

If that is all correct, there is probably something dirty in your configuration; try to restart from the beginning (a command-line sketch of these steps follows the list):

  • stop the Splunk service (did you stop the Splunk service before the downgrade?),
  • save outputs.conf, deploymentclient.conf and the TAs to a different location,
  • delete the forwarder folder (on Linux) or uninstall the application (on Windows),
  • install the forwarder again,
  • connect it to the Deployment Server ($SPLUNK_HOME/bin/splunk set deploy-poll <deployment_server>:<management_port>),
  • check that the TA containing outputs.conf is correctly deployed,
  • check that internal logs are sent,
  • check the other TAs,
  • check that the monitored logs are sent.
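A minimal sketch of those steps on a Linux UF, assuming the default /opt/splunkforwarder install path and a placeholder deployment server name:

    # 1. stop the forwarder
    /opt/splunkforwarder/bin/splunk stop

    # 2. back up the configs you want to keep
    mkdir -p /tmp/uf_backup
    cp -r /opt/splunkforwarder/etc/system/local /tmp/uf_backup/system_local
    cp -r /opt/splunkforwarder/etc/apps /tmp/uf_backup/apps

    # 3. remove the old installation
    rm -rf /opt/splunkforwarder

    # 4. install the forwarder package again (tar/rpm/deb as appropriate), start it once,
    #    then point it at the deployment server and restart
    /opt/splunkforwarder/bin/splunk start --accept-license
    /opt/splunkforwarder/bin/splunk set deploy-poll ds.example.com:8089
    /opt/splunkforwarder/bin/splunk restart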

Bye.
Giuseppe


koshyk
Super Champion

When you say the same machine, are the UF and Splunk Enterprise on the same machine? If yes, a port conflict could be the problem.
Are you using a deployment server to manage the UF? How did you establish that the UF is connected to the indexer?
Do you have outputs.conf configured to send data to the indexers?
Can you please provide:
- the outputs.conf from your UF
- the splunkd.log from your UF, to see how it is connecting
- whether you are using TLS
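The commands below are one way to collect that information on the UF; the paths assume a default Linux install:

    # effective outputs.conf as the UF actually resolves it
    /opt/splunkforwarder/bin/splunk btool outputs list --debug

    # recent forwarding-related entries in splunkd.log
    grep -E "TcpOutputProc|Connected to idx" /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -20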


lfedak_splunk
Splunk Employee

Hey @vikram_m, if gcusello solved your problem, remember to "Accept" an answer to award karma points 🙂
