@yongyuthvis This is something we did about a year ago. Could you please let me know whether you are using TLS 1.2 or something else? Also, check with your Kafka team whether the cluster can currently make a successful connection and forward the data to all downstream products. If not, you will need to work with the Kafka team to get that in place (FYI, this only applies if you are using Kafka to forward data to multiple applications in your organization, such as Splunk, ELK, etc.).
Once you are good with these, download Splunk Connect for Kafka (https://splunkbase.splunk.com/app/3862/) and update the required configuration based on the requirements shown in the Splunk docs: https://docs.splunk.com/Documentation/KafkaConnect/latest/User/About. Make sure to generate a Splunk HEC token so that the incoming data from the Kafka bus is accepted using this token. After that, start the Kafka broker and the Connect server on the Kafka side and execute the command provided by Splunk in the doc above. That will start forwarding the data to the Splunk heavy forwarder (HF); processing happens at the HF level, and the HF then sends the data on to the Splunk indexers.
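As a rough illustration, a minimal sink connector configuration might look like the following. The HEC URI, token, and topic names are placeholder values; check the Splunk docs above for the full, authoritative list of parameters:

```properties
# splunk-sink.properties -- illustrative values only
name=splunk-sink
connector.class=com.splunk.kafka.connect.SplunkSinkConnector
tasks.max=2
topics=my-topic-1,my-topic-2

# HEC endpoint on the Splunk HF and the token you generated for it
splunk.hec.uri=https://splunk-hf.example.com:8088
splunk.hec.token=00000000-0000-0000-0000-000000000000

# Leave cert validation on; relax only for self-signed certs in non-prod
splunk.hec.ssl.validate.certs=true
```

In distributed mode you would submit this same configuration as JSON to the Kafka Connect REST API instead of a properties file.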
Before executing the Splunk commands or starting the Kafka servers, make sure to use certificates that match your org's requirements, e.g. self-signed certs or Kerberos authentication. Check with your Kafka team.
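For example, if your org secures Kafka with TLS (a truststore/keystore) or Kerberos, the Kafka client side of the Connect worker would carry settings along these lines. All paths and passwords below are placeholders:

```properties
# TLS (self-signed or CA-signed certs kept in a truststore)
security.protocol=SSL
ssl.truststore.location=/etc/kafka/certs/truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/kafka/certs/keystore.jks
ssl.keystore.password=changeit

# Or, for Kerberos-secured clusters:
# security.protocol=SASL_SSL
# sasl.mechanism=GSSAPI
# sasl.kerberos.service.name=kafka
```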
You might need to set up that data forwarding per Kafka topic every time a new topic is created. I used Ansible to automate identifying the new topic and executing the command, but you can do this with any other automation tool as well. Please do accept the answer if it helps your scenario. Thanks.
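The automation I described boils down to two steps: diff the current topic list against the topics that already have a sink, then register a connector for each new one. A minimal shell sketch of that idea, assuming a Kafka Connect REST endpoint (the URL and connector-name prefix are hypothetical):

```shell
#!/bin/sh
# Sketch only: detect topics without a Splunk sink and create one for each.
CONNECT_URL="${CONNECT_URL:-http://localhost:8083}"

# Print topics present in the first file but not the second.
# Both arguments are files with one topic per line, sorted.
new_topics() {
  comm -23 "$1" "$2"
}

# Create a Splunk sink connector for one topic via the Connect REST API.
create_sink() {
  topic="$1"
  curl -s -X POST "$CONNECT_URL/connectors" \
    -H "Content-Type: application/json" \
    -d "{\"name\": \"splunk-sink-$topic\", \"config\": {
          \"connector.class\": \"com.splunk.kafka.connect.SplunkSinkConnector\",
          \"topics\": \"$topic\"}}"
}
```

In a real run you would feed `new_topics` the output of `kafka-topics.sh --list` (sorted) and the list of topics already covered by existing connectors, then call `create_sink` for each line it prints; a scheduler or an Ansible playbook can run this on an interval.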