Getting Data In

Scripted inputs and duplicate event data

himynamesdave
Contributor

Hi all.

I have built a simple scripted input that grabs XML data over HTTP:

#!/bin/bash
# Scripted input: fetch the XML feed and print it to stdout for Splunk to index
curl -s http://www.a.com/EN.XML

Everything works, BUT Splunk indexes the entire file every time the script polls it, resulting in duplicate events.

What is the best way to compare incoming events against what Splunk has already indexed, so that Splunk only pulls in events that have not been indexed before?

Thanks!

1 Solution

Ayn
Legend

The best (and possibly only) way is to implement this logic in your script. Splunk has no built-in ability to compare incoming scripted-input data against what is already in the index.

My suggested approach would be to edit your script so it keeps the last version of the XML file, and on each request compare the newly fetched data against that previous version, emitting only what is new.
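A minimal sketch of that approach, assuming one event per line in the feed; the state-file path and the `emit_new` helper are my own illustrative names, not anything Splunk provides:

```shell
#!/bin/bash
# Sketch of a dedup-aware scripted input.
# emit_new prints lines of the current file that are absent from the previous one.
emit_new() {
    local prev="$1" curr="$2"
    if [ -s "$prev" ]; then
        # -F: fixed strings, -x: whole-line match, -v: invert, -f: patterns from file.
        # "|| true" keeps the script's exit status clean when nothing is new.
        grep -F -x -v -f "$prev" "$curr" || true
    else
        # First run: no previous copy, so emit everything.
        cat "$curr"
    fi
}

STATE="/tmp/en_xml_prev"            # assumed location for the saved previous copy
CURR="$(mktemp)"

curl -s -m 5 http://www.a.com/EN.XML -o "$CURR" || true

touch "$STATE"
emit_new "$STATE" "$CURR"           # Splunk indexes only these lines
mv "$CURR" "$STATE"                 # save the current copy for the next run
```

Note the line-by-line comparison only works if each event sits on its own line; if events span multiple lines you would need to split the XML into per-event records (e.g. with `xmllint` or a small parser) before diffing.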

himynamesdave
Contributor

Thought so (I was hoping I could cheat) 🙂

Thanks for your help!
