Getting Data In

How to handle a daily changing CSV file and avoid indexing duplicate events/rows?

HeinzWaescher
Motivator

Hi,

I have a daily growing CSV file that I want to index. Just importing it every day would end up in a lot of duplicate events. I've read about the followTail option, but also that this option is not recommended. How can I avoid duplicate events? My first thought was to create a daily scheduled search that deletes all "old" events and keeps only those from the last indexed copy of the file (see the sketch below), but I hope there is a better option.
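Something roughly like this is what I had in mind (index and source are placeholders, matching the examples below; as I understand it, delete only hides events from searches rather than reclaiming disk space, and it requires the can_delete role):

index=something sourcetype=csv source=path/filename.csv
| eval it=_indextime
| where it < relative_time(now(), "@d")
| delete

That would mask everything indexed before today, leaving only the latest load visible.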

Cheers
Heinz


jaredlaney
Contributor

Maybe you should consider looking at the KV store. I believe it has an upsert capability through a RESTful interface.

http://docs.splunk.com/Documentation/Splunk/6.2.5/Admin/AboutKVstore
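As a rough sketch only (the collection-backed lookup daily_csv_lookup and the key column id are made-up names; you'd first define the collection and a lookup definition for it), a scheduled search could upsert each day's rows into the collection:

index=something sourcetype=csv source=path/filename.csv
| eval _key=id
| outputlookup append=true key_field=_key daily_csv_lookup

I believe outputlookup with append=true and key_field updates rows whose key already exists instead of duplicating them, and inserts the rest.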

thirumalreddyb
Communicator

If the following is your case:

Sep 1, 2015: the file has 10 records.
Sep 2, 2015: the file has 17 records (7 new ones, plus the same 10 from the previous day regenerated).
Sep 3, 2015: the file has 25 records in total (10 from Sep 1, 7 from Sep 2, and 8 new records from Sep 3).

Then you can monitor the file continuously (while indexing) and make sure new data is appended to the existing file instead of the file being regenerated (copy and paste the new rows into the old file if you are doing it manually). That way the monitor input only picks up the new lines, so you don't get duplicate records.

If your scenario is different, then just use:

index=something sourcetype=csv source=path/filename.csv | dedup _raw | your analysis code
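And if the regenerated rows are not byte-for-byte identical from day to day (say, a timestamp column changes on each export), dedup _raw will miss them; deduplicate on a stable key instead. A sketch, assuming your CSV has some unique column like record_id:

index=something sourcetype=csv source=path/filename.csv
| dedup record_id sortby +_time
| your analysis code

The sortby +_time clause makes dedup keep the earliest copy of each record.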

Hope this is helpful for you.

HeinzWaescher
Motivator

| dedup _raw is a good first workaround

thanks!
