Getting Data In

How to update many KV Store records from results of a search?

MartinLenggenha
Explorer

Hi Ninjas,
I have been playing with KV Store and am wondering if any of you have updated a table with multiple results from a search. The background is a data feed from a ticketing application where an event lands in the index when a new ticket is generated. After every update, I receive another event with the changes made.

In order to provide statistics on how many tickets are created for a particular date/time, how long a ticket has been idle, and how long it took from creation until closure, I need to store the creation time in a static table and update it periodically with a saved search. I tried it with as few fields as possible.

My transforms.conf looks like:

[ticket_lookup]
external_type = kvstore
collection = ticket_status
fields_list = _key TS_new Status_new TS_mod Status_mod

The _key is mapped to TicketID when I load the initial data (saved search).
The initial saved search fills the fields _key, TS_new, and Status_new:

index=test... | eval _key = TicketID | table _key TS_new Status_new | outputlookup ticket_lookup append=true

This populates the table just fine.

Now I would like to update the table with data for TS_mod and Status_mod. For that I would run a query that searches and filters all the TicketIDs and the *_mod data. But just running the search below would overwrite the _new data, which is not my intention.

index=test... | eval _key = TicketID | table _key TS_mod Status_mod | outputlookup ticket_lookup append=true

The _new data can be several months old, and I would like to avoid running a query that far back in history. Also, there could be several updates to the *_mod fields until the final closed status is reached; we only want to write the latest status when the saved search runs.
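Since only the latest status per ticket should be written, one way to collapse multiple update events into a single row per TicketID before the outputlookup is a stats-based dedup. A minimal sketch, using the field names from the post (`index=test...` stands for whatever the actual update search is):

```
index=test...
| stats latest(TS_mod) as TS_mod latest(Status_mod) as Status_mod by TicketID
| eval _key = TicketID
| table _key TS_mod Status_mod
```

`stats latest(...)` keeps the value from the chronologically most recent event per TicketID, so only one row per ticket reaches the lookup.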

Could anyone point me in the right direction on how to arrange the query so it updates the KV Store without deleting the original *_new entries?

Many thanks!
Martin

1 Solution

MartinLenggenha
Explorer

Hello,

I would like to update you on this question and propose a solution that I have just set up and tested. Since I could not figure out how to update only two columns with the results of a search containing a large number of rows, I decided to split the KV Store into two tables:
one with the fields_list for new TicketIDs, including _key, TS, and Status (collection = ticket_status), and a second KV Store containing the latest update, fed by the results of the updating Splunk search (collection = ticket_update_status). These two tables can then be combined in a search using inputlookup, and the duration calculated where required.

Here is the new config:

collections.conf

[ticket_status]
[ticket_update_status]

transforms.conf

[ticket_lookup]
external_type = kvstore
collection = ticket_status
fields_list = _key TS Status

[ticket_update_lookup]
external_type = kvstore
collection = ticket_update_status
fields_list = _key TS Status

The _new and _mod suffixes are no longer necessary since the two tables are now separate, which also makes writing the queries simpler.
The search to update the ticket_status KV Store table for newly created tickets now looks like this:

index=test... | eval _key = TicketID | table _key TS Status | outputlookup ticket_lookup append=true

And the update with the latest status after tickets are modified goes into the ticket_update_lookup KV Store table:

index=test... | eval _key = TicketID | table _key TS Status | outputlookup ticket_update_lookup append=true

To query the two tables and calculate the ticket duration based on the two TS values:

| inputlookup ticket_lookup | inputlookup append=t ticket_update_lookup | eval TicketID = _key | stats dc(Status) as check min(TS) as min_ts max(TS) as max_ts by TicketID | eval ticket_duration=(max_ts-min_ts)/60 | where check>1 | table TicketID check ticket_duration
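The original goal also included counting how many tickets are created for a particular date/time. Assuming TS holds epoch time (as the duration arithmetic above implies), a sketch of such a report over the creation table could look like:

```
| inputlookup ticket_lookup
| eval created_day = strftime(TS, "%Y-%m-%d")
| stats count as tickets_created by created_day
| sort created_day
```

This reads only the KV Store, so it avoids re-searching months of raw index data.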
