I have, say, 20 containers. How do I forward the individual containers' logs to Splunk? Do I need to install forwarders in each of the containers? And going forward, if the number of containers increases, do I handle it the same way?
Are any best practices available for this?
Regards,
Pradipta
For a small number of containers, the individual UF (universal forwarder) solution is functional.
For a highly scalable option, go with @acharlieh's answer or see below.
[Revision of previous answer]
I would highly recommend sending the logs off the containers to a central system and monitoring that with a universal forwarder. You could even send the container logs to a syslog server and simply monitor the receiving server for the incoming logs with a universal forwarder. I believe this provides the scalability you are looking for.
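As a sketch of the syslog approach: Docker ships with a built-in syslog logging driver, so the containers themselves need no agent. The host and port below are placeholders — substitute your central syslog server.

```json
// /etc/docker/daemon.json — send all container stdout/stderr to a central
// syslog server (logserver.example.com is an assumed placeholder hostname)
{
  "log-driver": "syslog",
  "log-opts": {
    "syslog-address": "udp://logserver.example.com:514"
  }
}
```

Restart the Docker daemon after editing, then monitor the receiving server's log files with a single universal forwarder.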
Thanks for the reply and for clearing my doubts. If you have any ready reference on how to send container logs to a syslog server, that would be helpful.
Regards,
Pradipta
It depends on your container OS.
For RHEL/CentOS:
http://linuxsysconfig.com/2013/04/how-to-configure-remote-logging-on-rhel6-centos6/
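The gist of that guide, as a minimal sketch (the server hostname is a placeholder): on RHEL/CentOS the client side only needs one forwarding rule in rsyslog.

```
# /etc/rsyslog.conf on the client (container or host)
# Forward all facilities/severities to the central log server.
# "@@" = TCP; a single "@" would use UDP instead.
*.* @@logserver.example.com:514
```

On the central server, enable the rsyslog TCP/UDP input modules and restart the `rsyslog` service on both ends.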
I would argue with grimlock and say that while adding a UF within each container gives the most control, it could be considered a sub-optimal solution, as it would increase the size and resource requirements for running each container. (But it depends on your designs and use cases, of course.)
Other options for capturing containerized logs include using a UF/HWF (heavy forwarder) on the container host to monitor the captured stdout files, and possibly files within all running containers (if you have a lot of control over the formats printed to stdout, header mode may be interesting here). I vaguely remember @dmaislin_splunk used Logstash to interact with the container host and dynamically pull different files out of each container to send over to Splunk, but I'm not sure of the exact mechanics; it's been a while.
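For the host-side option: when containers use Docker's default json-file driver, each container's stdout lands in a per-container JSON log file on the host, so one forwarder stanza can cover all of them. The sourcetype and index names below are assumptions — adjust to your environment.

```
# inputs.conf on the UF/HWF running on the container host
# Matches the json-file driver's per-container log files.
[monitor:///var/lib/docker/containers/*/*-json.log]
sourcetype = docker:json
index = containers
disabled = false
```

New containers are picked up automatically by the wildcard, so this scales without touching the forwarder config again.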
Alternatively (or additionally), you could have your containers, or some of your container logging systems, push data to a (cluster of) HTTP Event Collector (HEC) endpoints. Docker and others have logging drivers for this.
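As a sketch of the HEC route: Docker's `splunk` logging driver posts container output straight to a HEC endpoint. The URL and token below are placeholders for your own Splunk instance and HEC token.

```json
// /etc/docker/daemon.json — ship container logs directly to Splunk HEC
// (splunk.example.com and the token value are assumed placeholders)
{
  "log-driver": "splunk",
  "log-opts": {
    "splunk-url": "https://splunk.example.com:8088",
    "splunk-token": "<your-HEC-token>"
  }
}
```

The same options can be set per container with `docker run --log-driver=splunk --log-opt ...` if you only want some containers to use HEC.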
Thanks for the reply. Can you share any reference links for the solutions you have described?
Regards,
Pradipta
Best practice would be to install a forwarder on each container, yes.
Simply add the log folder to the monitored inputs on the forwarder, or configure a deployment server to deliver the appropriate app.
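A minimal command-line sketch of that setup on each forwarder (the indexer hostname and log path are assumed placeholders):

```shell
# From $SPLUNK_HOME/bin on the universal forwarder:
# 1. Point the forwarder at your indexer's receiving port (9997 is the default)
./splunk add forward-server splunk-indexer.example.com:9997

# 2. Monitor the container's log folder
./splunk add monitor /var/log/myapp -sourcetype myapp_logs
```

A deployment server can push the equivalent inputs.conf/outputs.conf as an app instead, which is easier to manage at scale.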
Please see the following links for command line implementations of forwarder deployments:
Configuring the forwarders:
http://docs.splunk.com/Documentation/Forwarder/6.6.0/Forwarder/Configuretheuniversalforwarder
Setting the deployment server:
https://docs.splunk.com/Documentation/Splunk/6.6.0/Updating/Configuredeploymentclients
Configure and set up deployment server:
http://docs.splunk.com/Documentation/Splunk/6.6.0/Updating/Createdeploymentapps
Hope this helps
It's fine when I have a small number of containers, but going forward, if I have 1,000 containers, that would be very difficult. Is there any other way? Please give your inputs.
Regards,
Pradipta
Please see revised answer at the bottom reflecting response converted to answer.