Splunk Enterprise

Oneshot and Kvstore update

tlmayes
Contributor

I need to upload the contents of a CSV that exceeds the size allowed in our web.conf. I will modify that setting as a last resort, but.....
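For reference, the web.conf setting involved is the Splunk Web upload cap, which is the change being held as a last resort (the value below is illustrative, not a recommendation):

```
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
# Maximum size of files uploaded through Splunk Web, in MB
max_upload_size = 1024
```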

Ultimately I need to populate a KV store with the contents of a 100+ MB CSV. I investigated oneshot, but I don't see how to ingest the file contents into a KV store, only into an index. Any suggestions would be welcome.
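For what it's worth, once the CSV is on disk as an ordinary lookup file, a KV store collection can be populated with a single search, no oneshot needed. A minimal sketch, assuming an app containing a collection named `asset_collection` and a lookup definition `asset_kv` bound to it (all names and fields here are placeholders, and `fields_list` must match the CSV columns):

```
# collections.conf -- declares the KV store collection
[asset_collection]

# transforms.conf -- lookup definition bound to the collection
[asset_kv]
external_type = kvstore
collection = asset_collection
fields_list = _key, ip, hostname, owner

# SPL -- copy the file-based lookup into the collection
| inputlookup assets.csv | outputlookup asset_kv
```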

1 Solution

woodcock
Esteemed Legend

Why do you need it in a kvstore? Can you not simply deploy it as a file-based lookup?



tlmayes
Contributor

I can, and did, with multiple smaller files. My concern was performance when accessing a 200k+ row file that will be queried regularly (an asset library). If that isn't a big deal (I haven't been there yet), then I'm fine with using file-based lookups.

If so, the question becomes: how can I populate a file-based lookup without modifying my web.conf, and without having to break up the file into smaller chunks for ingestion?


woodcock
Esteemed Legend

There is no limit if you are using lookup; the limit applies only if you are using inputlookup. In the super-majority of cases when people think that they should be using inputlookup, they really should be using lookup. Tell us exactly what you need to do, and we can probably make it work with lookup, and you will be golden.
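The distinction here, sketched in SPL (index, field, and lookup names are placeholders): inputlookup loads the entire file into the search pipeline, which is where large-lookup limits bite:

```
| inputlookup assets.csv
```

lookup, by contrast, only joins matching rows against events already in the pipeline, so the size of the file matters far less:

```
index=firewall | lookup assets.csv ip AS src_ip OUTPUT owner hostname
```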


tlmayes
Contributor

To the question of "what exactly are you needing": at this point, only to make the data available. How the community of users will use it is TBD (so an unknown at this point). I get your point (I think) about using lookup for a refined search vs. inputlookup for a simple listing.

So if I use a lookup table, the question is similar: is there a better way to upload the file that avoids modifying web.conf (which is currently blocking the upload)?


woodcock
Esteemed Legend

You just put the file where it needs to be via any means available. I am not aware that the GUI-based "upload a lookup file" has any size limit (I am pretty sure that it does not). Give it a try.


tlmayes
Contributor

It's the simple things in life that always get in the way 😕

Yes, the GUI has a limit, which keeps users out of trouble... but as you suggested, why don't I simply upload it via the CLI to the same location as the others? Wait a minute, I recall something about some forests and trees... Thanks for the "nudge".

And yes, scp to each of the WFEs worked just fine.
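The manual placement described above amounts to copying the CSV into an app's lookups directory on each search head, something like the following (host, app, and path are illustrative):

```
scp assets.csv splunk@searchhead1:/opt/splunk/etc/apps/my_app/lookups/
```

After the copy, the file should be usable by lookup/inputlookup just like any GUI-uploaded lookup, subject to the app's permissions.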


woodcock
Esteemed Legend

I was thinking Deployment Server, but that works!


woodcock
Esteemed Legend

I have done millions of long rows with the Deployment Server, and also via app install (lookups built into an app).
