Getting Data In

How to index Kubernetes STDOUT data in Splunk?

dhavamanis
Builder

Need your help,

Can you please tell us how to receive Kubernetes STDOUT data in Splunk Enterprise? Kubernetes is running on CoreOS.

Thank you,


pkisplunk
Explorer

We used Fluentd with Splunk Cloud and it worked seamlessly.

If anyone using Splunk Cloud sees this answer: the methods in this thread apply to both Splunk Enterprise and Splunk Cloud.

mattymo
Splunk Employee

Hey dhavamanis,

We have released Splunk Connect for Kubernetes!

It uses fluentd and heapster to get you logs, metrics, and metadata, and it is Splunk-built and supported!

Check it out!

https://github.com/splunk/splunk-connect-for-kubernetes
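
For anyone who wants a feel for the configuration, here is a minimal, hypothetical Helm values override for the connector. The key names are assumptions based on the chart layout and may differ between chart versions, so verify them against the values.yaml shipped in the repo above before installing:

# values.yaml - hypothetical minimal override for Splunk Connect for Kubernetes.
# Key names are assumptions; check the chart's own values.yaml before use.
global:
  splunk:
    hec:
      host: splunk.example.com                      # assumption: your HEC endpoint
      port: 8088                                    # default HEC port
      token: 00000000-0000-0000-0000-000000000000   # placeholder HEC token

You would then install the chart from the repo with something like helm install -f values.yaml pointed at the chart.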

- MattyMo

abdulc
New Member

Is there a way to trim colors from fluentd-hec output, similar to what is suggested in https://github.com/mattheworiordan/fluent-plugin-color-stripper, at a pod level?


outcoldman
Communicator

We just published the first version of our application "Monitoring Kubernetes" (https://splunkbase.splunk.com/app/3743/) and collector (https://www.outcoldsolutions.com). Please take a look at our manual on how to get started: https://www.outcoldsolutions.com/docs/monitoring-kubernetes/


agup006
Explorer

Hi Dhavamanis,

Fluentd is one of the preferred logging layers for Kubernetes and is commonly used to route Kubernetes data to Splunk, Elasticsearch, Kafka, Amazon S3, etc. Using a Kubernetes DaemonSet, you can deploy a Fluentd agent on every Kubernetes node and configure it to route stdout, stderr, and other data into Splunk, Elasticsearch, etc.; a minimal DaemonSet sketch follows the documentation links below. Additionally, Fluentd can enrich each event with information about the Kubernetes Pod, Namespace, and Node.

Documentation of Kubernetes Daemon Set: https://kubernetes.io/docs/concepts/workloads/controllers/daemonset/
Documentation on Fluentd Daemon Set: http://docs.fluentd.org/v0.12/articles/kubernetes-fluentd
Documentation on Fluentd-Elasticsearch Daemon Set: http://docs.fluentd.org/v0.12/articles/kubernetes-fluentd#logging-to-elasticsearch
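
To make the DaemonSet approach concrete, here is a minimal sketch. It assumes the generic fluent/fluentd-kubernetes-daemonset image and the standard host log paths; the actual image, environment variables, and output plugin configuration depend on where you route the data (Splunk HEC, Elasticsearch, etc.):

# Minimal log-collector DaemonSet sketch (image tag and paths are assumptions).
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentd
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      containers:
      - name: fluentd
        image: fluent/fluentd-kubernetes-daemonset:v1-debian-elasticsearch
        volumeMounts:
        - name: varlog
          mountPath: /var/log                   # kubelet and container log directory
        - name: dockercontainers
          mountPath: /var/lib/docker/containers # container stdout/stderr JSON files
          readOnly: true
      volumes:
      - name: varlog
        hostPath:
          path: /var/log
      - name: dockercontainers
        hostPath:
          path: /var/lib/docker/containers

Because a DaemonSet schedules one Pod per node, every node's container logs get picked up without any per-application changes.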

If you want a supported Splunk plugin and a Kubernetes -> Splunk DaemonSet, Fluentd Enterprise offers SLA support for sending data to Splunk Enterprise and Splunk Cloud. For more information, you can email me at A@ Treasuredata.com or see https://fluentd.treasuredata.com

Thanks,
Anurag

mattymo
Splunk Employee

What path did you end up on?

- MattyMo

MuS
Legend

Hi dhavamanis,

This is not a Splunk problem, but a Kubernetes problem ... nevertheless, a quick Google search revealed this:

When a cluster is created, the standard output and standard error output of each container can be ingested using a Fluentd agent running on each node into either Google Cloud Logging or into Elasticsearch and viewed with Kibana.

From here https://github.com/kubernetes/kubernetes/blob/master/docs/getting-started-guides/logging.md

If you can get it into ES/Kibana, you can get it into Splunk 😉

Hope this helps, and no, I have no idea what Kubernetes is and cannot be of further help 🙂

cheers, MuS


vam111
New Member

What's the latest way to forward K8s application-level logs (from containers in Pods) to Splunk?

I want to understand how a pull-based method for fetching container-level data from a Google K8s cluster can be configured for Splunk.
