
Any Advice on Updating a Historical KV Store for Vulnerability Data with Scheduled Search?

jaspersplunkfu
Engager

I have a use case in which I am attempting to create a historical KV Store, with field/value pairs like the following:

host = string (attempting to use as unique _key)

Vulnerabilities discovered on 1/1 = string

Vulnerabilities discovered on 1/8 = string

Vulnerabilities discovered on 1/15 = string
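
For reference, I set the collection and lookup definition up roughly like this in collections.conf and transforms.conf (the stanza and field names are just my placeholders):

collections.conf:

[historical_vuln_kv]

transforms.conf:

[HISTORICAL_VULN_KV]
external_type = kvstore
collection = historical_vuln_kv
# _key and host, plus one entry per scan-date column as they get added
fields_list = _key, host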

 

For the initial data, I am trying to list out all of the vulnerability signatures discovered on each device by the weekly scan, delimited by ";", to consolidate everything found in that scan. I then want to update the KV Store with each subsequent weekly scan's output, using the host as the _key value. This example query shows what I am initially attempting to do with the scan output from 1/1:

index=vulnerabilitydata
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| rename Signatures as "Vulnerabilities discovered on 1/1"
| outputlookup HISTORICAL_VULN_KV

As you can imagine, this gives me a semicolon-delimited list of vulnerability signatures for each host in our network.

What I am trying to figure out is how I can craft a scheduled weekly search to:

1. Append new vulnerability signatures to a host's existing record, if that host is already present as a unique _key value in the existing KV Store

2. Create a new record when vulnerabilities are discovered on a new host (_key), so that host is tracked in future scans

3. Bonus: Dynamically rename Signatures to the scan date. In my experience, rename only accepts a static string value and cannot incorporate Splunk timestamp logic for dynamic field naming (see the sketch just below this list).
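
The closest I have found for that bonus item is eval's curly-brace syntax for dynamic field names, along the lines of this sketch (field_name is just a placeholder of mine):

| eval field_name="Vulnerabilities discovered on ".strftime(now(), "%m/%d")
| eval {field_name}=Signatures
| fields - Signatures, field_name

As I understand it, {field_name} creates a field whose name is the value of field_name, which seems like it could replace the static rename.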

 

I feel like this should be fairly easy to accomplish. I have tried messing around with append=true and subsearches to update each host with the new scan data, but I keep running into issues with the update itself and with maintaining results for multiple separate scan dates per host. I would like to capture everything properly and then use outputlookup to update the original KV Store so it maintains a historical record of each weekly scan over time.
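
For context, the rough shape I have been experimenting with for the weekly scheduled search is below. The merge via append plus stats is just my own guess at an approach, and the lookup/field names are the same placeholders as above:

index=vulnerabilitydata earliest=-7d@d latest=now
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| eval field_name="Vulnerabilities discovered on ".strftime(now(), "%m/%d")
| eval {field_name}=Signatures
| fields - Signatures, field_name
``` pull in the existing records so prior scan-date columns are not lost ```
| append [| inputlookup HISTORICAL_VULN_KV]
``` collapse new and existing rows into one record per host ```
| stats values(*) as * by host
| eval _key=host
| outputlookup HISTORICAL_VULN_KV

The idea here is to rebuild the whole collection each week from the merged rows rather than relying on append=true, but I have not confirmed that this is the right pattern.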

Do I need to be more mindful of how I am using outputlookup in conjunction with key_field=host (or _key)?
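
To make that question concrete, the variant I keep circling back to looks like this (again just a sketch with my placeholder names):

index=vulnerabilitydata
| stats values(signature) as Signatures by host
| eval Signatures=mvjoin(Signatures, "; ")
| outputlookup append=true key_field=host HISTORICAL_VULN_KV

My understanding is that key_field=host tells outputlookup to use each host value as the record's _key, and that append=true updates records whose _key already exists rather than replacing the whole collection, but I am not sure whether an update like that preserves the columns from earlier scan dates.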

 
