Getting Data In

RRD Data into Splunk

technick
Explorer

I need some help getting RRD data into Splunk.

Data Example:

1340655780: 9.2189559556e+05 2.6145535333e+06

1340658000: 9.8897729778e+05 2.3643315422e+06

1340660220: 6.0333832000e+05 2.3522330178e+06

1340662440: 1.5102271111e+06 2.6492235911e+06

The first column is epoch time, second is network octets out, third is network octets in.
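A minimal parsing sketch for lines like the sample above (the field names are my assumption, based on the column description):

```python
# Hypothetical sketch: parse the space-delimited RRD lines shown above
# into (epoch, octets_out, octets_in) tuples before ingesting or graphing.

def parse_rrd_line(line):
    """Split 'EPOCH: OUT IN' into typed fields."""
    ts, values = line.split(":", 1)
    octets_out, octets_in = (float(v) for v in values.split())
    return int(ts), octets_out, octets_in

sample = "1340655780: 9.2189559556e+05 2.6145535333e+06"
print(parse_rrd_line(sample))  # (1340655780, 921895.59556, 2614553.5333)
```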

What's the best way to get this data into Splunk and graph it?

Thanks in advance for your help.


DUThibault
Contributor

Another approach, particularly when collectd is monitoring a remote system on which a Splunk Universal Forwarder is installed, is to select the CSV output plugin for collectd and have the forwarder monitor the CSV output directory (the DataDir setting).
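A minimal sketch of that setup (the paths, sourcetype, and index names are assumptions, not from this thread):

```
# collectd.conf (on the monitored host)
LoadPlugin csv
<Plugin csv>
  DataDir "/var/lib/collectd/csv"
  StoreRates true
</Plugin>

# inputs.conf (on the Universal Forwarder)
[monitor:///var/lib/collectd/csv]
sourcetype = collectd_csv
index = linux_metrics
```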


smitra_splunk
Splunk Employee

I'm facing a unique issue with collectd 5.8.1 on CentOS 7.6. Some of the collectd metric fields in the JSON have no values and/or no field/attribute names. This causes Splunk to reject the events and not ingest them at all, citing errors like the one below.
search peer idx-xyzmydomain.com has the following message: Metric value= is not valid for source=collectd_hec_token, sourcetype=httpevent, host=aa.xx.yy.zz, index=linux_metrics. Metric event data with an invalid metric value would not be indexed. Ensure the input metric data is not malformed.

I've spent time trying to make head or tail of it, but I need an alternate way to ingest metrics by parsing selected fields from the collectd-generated CSV data. @DUThibault, could you please share the details of your implementation?


DUThibault
Contributor

@smitra_splunk We faced a number of constraints that did not allow use of JSON as a transmission format; the older collectd we used also limited the plug-ins we could use, which meant a few data streams would be missing from those expected by the Splunk Add-on for Linux. This second constraint is of course not a problem if you're doing your own analysis of the data streams. We were also unable to use collectd's write_graphite plug-in. We ended up using collectd's write_csv to "log" the data locally, combined with a Universal Forwarder that processed the logs and sent the events using a simulated linux:collectd:graphite sourcetype.
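The CSV-to-graphite conversion step could look something like this sketch (the host/plugin/type names and the metric-path layout are illustrative assumptions; real write_csv files are named <DataDir>/<host>/<plugin>/<type>-YYYY-MM-DD with an "epoch,value,..." header line):

```python
# Hypothetical sketch of converting a collectd write_csv row into
# graphite plaintext lines of the form "metric.path value epoch".

def csv_row_to_graphite(host, plugin, type_, header, row):
    """Emit one graphite-style line per value column."""
    names = header.split(",")[1:]   # drop the leading "epoch" column
    fields = row.split(",")
    epoch, values = fields[0], fields[1:]
    return [
        f"{host}.{plugin}.{type_}.{name} {value} {epoch}"
        for name, value in zip(names, values)
    ]

lines = csv_row_to_graphite(
    "web01", "interface-eth0", "if_octets",
    "epoch,rx,tx",
    "1340655780,2614553.5333,921895.5956",
)
print(lines[0])  # web01.interface-eth0.if_octets.rx 2614553.5333 1340655780
```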

The Universal Forwarder uses a network connection to send its data, very much like write_http does, but offers several advantages despite its light footprint: it can tag metadata; it buffers, compresses and secures the data transfers; it can consolidate data; it can handle index-time transformations; and it can even do load balancing (when its data are being consumed by several Splunk indexers).

Now, your problem seems to be that collectd is sending empty JSON fields, so my first thought would be to check the collectd configuration. The transmission mode (HEC vs. http vs. TCP vs. UDP) is extremely unlikely to be at fault here. Which collectd plug-ins are you using?


sbrant_splunk
Splunk Employee

Your best bet is to utilize either rrddump (http://oss.oetiker.ch/rrdtool/doc/rrddump.en.html) or rrdxport (http://oss.oetiker.ch/rrdtool/doc/rrdxport.en.html) to write the data out to a text file (XML), which Splunk can then easily monitor and ingest.
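Once exported, the XML rows can be pulled apart quite easily. A sketch, assuming the `<row><t>…</t><v>…</v></row>` layout that rrdxport documents (the sample document below is hand-written for illustration, not output from the poster's RRD file):

```python
# Illustrative only: parse rrdxport-style XML rows into (epoch, value, ...)
# tuples using the standard library.
import xml.etree.ElementTree as ET

SAMPLE = """<xport>
  <meta><legend><entry>out</entry><entry>in</entry></legend></meta>
  <data>
    <row><t>1340655780</t><v>9.2189559556e+05</v><v>2.6145535333e+06</v></row>
  </data>
</xport>"""

def parse_xport(xml_text):
    root = ET.fromstring(xml_text)
    for row in root.iter("row"):
        epoch = int(row.findtext("t"))
        values = [float(v.text) for v in row.findall("v")]
        yield (epoch, *values)

print(list(parse_xport(SAMPLE)))  # [(1340655780, 921895.59556, 2614553.5333)]
```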

technick
Explorer

What would my sourcetype be for this XML feed?
