
Will there be performance issues using KV Store with a large data set?

nawneel
Communicator

Hi all

I have a large data set (20 million records, growing since 2015). I need to use a lookup, and KV Store looks like the best fit because records in the index keep getting updated while the _key (ORDER_KEY) stays constant, so my lookup has to be updated in place as well. With this huge and growing data set, will I run into some sort of performance issue?
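To make the update pattern concrete, here is a minimal sketch of the kind of search I run (the lookup name order_kvstore_lookup and the status/amount fields are placeholders for my real ones):

index=orders
| eval _key = ORDER_KEY
| table _key, ORDER_KEY, status, amount
| outputlookup order_kvstore_lookup append=true

With append=true, rows whose _key matches an existing record update that record in place, while new keys are inserted.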

I thought of breaking it into multiple KV Store lookups by month, so that events from Nov 2015 go to kvlookup_nov2015, events from Dec 2015 go to kvlookup_dec2015, and so on, based on ORDER_KEY creation time. All the collections.conf and transforms.conf entries for the lookup definitions would be created in advance (sketched below), but I am not able to pick the right lookup at run time in the search with | outputlookup.
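For example, the pre-created entries would look roughly like this (one collection and one definition per month; fields other than ORDER_KEY are placeholders):

# collections.conf -- one KV Store collection per month
[kvlookup_nov2015]
[kvlookup_dec2015]

# transforms.conf -- a lookup definition for each collection
[kvlookup_nov2015]
external_type = kvstore
collection = kvlookup_nov2015
fields_list = _key, ORDER_KEY, status, amount

[kvlookup_dec2015]
external_type = kvstore
collection = kvlookup_dec2015
fields_list = _key, ORDER_KEY, status, amount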

I tried the macro approach with an eval-based definition: | outputlookup `filename(ORDER_KEY)`

[filename(1)]
args = ORDER_KEY
definition = case(match($ORDER_KEY$, "^201511.*"), "csv_lookup_nov_2015", match($ORDER_KEY$, "^201512.*"), "csv_lookup_dec_2015")
iseval = 1

It did not work for me. Please help me out.

Thanks in advance.


richgalloway
SplunkTrust

KV Store is intended to be used for large data sets, so I'd continue to use a single lookup rather than having to update your macro every month. If you have doubts, Google MongoDB, the database that backs the KV Store.
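As a rough sketch of that single-lookup setup (all names here are illustrative, not from the original post), the usual performance lever at this scale is an accelerated field on the key you query by, which creates an index on it in MongoDB:

# collections.conf -- one collection, with an index on ORDER_KEY
[order_collection]
field.ORDER_KEY = string
accelerated_fields.order_key_accel = {"ORDER_KEY": 1}

# transforms.conf -- a single lookup definition
[order_kvstore_lookup]
external_type = kvstore
collection = order_collection
fields_list = _key, ORDER_KEY, status, amount

Reads against the accelerated key should then scale well, e.g.:

... | lookup order_kvstore_lookup ORDER_KEY OUTPUT status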

---
If this reply helps you, Karma would be appreciated.