If you have any cheat sheet for setting up Kafka to send events to Splunk, that would help.
Hey @daniel,
You can try referring to this link:
https://docs.splunk.com/Documentation/KafkaConnect/1.1.0/User/InstallSplunkKafkaConnect
I did this Kafka setup before; the steps below are based on CentOS 7.
yum upgrade
yum install java-1.8.0-openjdk
Download Kafka
Here is the Splunkbase link for Splunk Connect for Kafka; it will point you to GitHub:
https://splunkbase.splunk.com/app/3862/
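If you want a direct download command for Kafka itself, older releases can be fetched from the Apache release archive. The URL below follows the standard archive layout for the tarball version used in this walkthrough; adjust the version numbers if you want a newer build:

```shell
# Fetch the Kafka 2.1.0 / Scala 2.11 tarball from the Apache archive.
# (URL assumed from the standard archive layout; newer releases live on
# the main Apache download mirrors instead.)
wget https://archive.apache.org/dist/kafka/2.1.0/kafka_2.11-2.1.0.tgz
```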
Once you have downloaded Kafka, untar it and rename the directory:
tar -xzf kafka_2.11-2.1.0.tgz
mv kafka_2.11-2.1.0 kafka
Start ZooKeeper and the Kafka server:
cd kafka
bin/zookeeper-server-start.sh config/zookeeper.properties > zookeeper.log &
bin/kafka-server-start.sh config/server.properties > kafka_server.log &
Create a Kafka topic:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
The commands below can be used to list and delete topics as well:
bin/kafka-topics.sh --list --zookeeper localhost:2181
bin/kafka-topics.sh --zookeeper localhost:2181 --delete --topic test
Send some events to the topic "test" so that you can retrieve them via Kafka Connect later:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
this is a test message
^C
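To confirm the message actually landed in the topic, you can read it back with the console consumer before involving the connector at all (Ctrl-C to exit when done):

```shell
# Read everything in topic "test" from the beginning; the
# "this is a test message" line sent above should appear.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```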
Check the worker properties file (config/connect-distributed.properties) and make sure the following settings are set correctly:
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
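One thing that often trips people up: the Connect worker only finds the Splunk sink if plugin.path points at the directory holding the splunk-kafka-connect jar. A sketch, assuming you placed the jar under /opt/kafka/connectors (that path is an assumption, use wherever you actually put it):

```shell
# Tell the Connect worker where to find the splunk-kafka-connect jar.
# /opt/kafka/connectors is an example location -- adjust to your setup.
echo "plugin.path=/opt/kafka/connectors" >> config/connect-distributed.properties
```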
Start Splunk-kafka-connect
bin/connect-distributed.sh config/connect-distributed.properties > kafka_connect.log &
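Once the worker is up, you can confirm through the Connect REST API that it is listening and that it picked up the Splunk sink plugin:

```shell
# List the connector plugins the worker loaded; the class
# com.splunk.kafka.connect.SplunkSinkConnector should appear in the output.
curl http://localhost:8083/connector-plugins
```

If the Splunk class is missing here, recheck plugin.path before going any further.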
Create an HEC input (token) in Splunk
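You can create the HEC token in Splunk Web (Settings > Data Inputs > HTTP Event Collector), or script it against the management port. A sketch via Splunk's REST API, assuming default admin credentials and management port 8089 (both assumptions, adjust to your environment):

```shell
# Create an HEC input named "kafka" that writes to index kafka_event.
# The generated token is in the response; use it as splunk.hec.token below.
curl -k -u admin:changeme https://localhost:8089/services/data/inputs/http \
    -d name=kafka -d index=kafka_event
```

Note that HEC itself also has to be enabled globally (Global Settings in Splunk Web), and the index has to exist before events will land in it.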
Create the splunk-kafka-connect task:
curl http://localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
"name": "test-single-event",
"config": {
"connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
"tasks.max": "1",
"topics":"test",
"splunk.sources": "test_kafka_event",
"splunk.indexes": "kafka_event",
"splunk.hec.uri": "https://localhost:8088",
"splunk.hec.token": "26faccb6-a0af-45d6-996e-7df97afb81fd",
"splunk.hec.raw": "false",
"splunk.hec.ack.enabled": "false",
"splunk.hec.ack.poll.interval": "10",
"splunk.hec.ack.poll.threads": "1",
"splunk.hec.ssl.validate.certs": "false",
"splunk.hec.http.keepalive": "true",
"splunk.hec.max.http.connection.per.channel": "4",
"splunk.hec.total.channels": "8",
"splunk.hec.max.batch.size": "500",
"splunk.hec.threads": "1",
"splunk.hec.event.timeout": "300",
"splunk.hec.socket.timeout": "120",
"splunk.hec.track.data": "true"
}
}'
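After POSTing the task, verify that it is actually running before searching Splunk for the event:

```shell
# List registered connectors, then check the task state; it should be RUNNING.
curl http://localhost:8083/connectors
curl http://localhost:8083/connectors/test-single-event/status
```

If the status shows FAILED, the trace field in the output usually names the cause (bad HEC token, SSL validation, unreachable HEC URI). Once it is RUNNING, search index=kafka_event in Splunk and the test message should show up.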