Getting Data In

How to regularly write filtered events to a new index?

xsstest
Communicator

I have an index named test that contains too many events. I need to filter them by keyword and write the results to an index named Useful_logs.

for example:

Filter conditions:

index=test sourcetype=abc "login" "user" "deviceId"

Then, at midnight every day, filter the events from the previous day,

and write the filtered events to the index Useful_logs.

Finally, I can search index=Useful_logs to find the logs I want.

Of course, some people may suggest that I configure transforms.conf.
But I want to keep all the logs in the test index and also write the useful logs to the new index (the Useful_logs index).

So what should I do?


s2_splunk
Splunk Employee

Why not simply set up an eventtype that returns just your useful data?
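
For example (just a sketch; the eventtype name useful_logs is only a placeholder), you could define it in eventtypes.conf:

# eventtypes.conf - the stanza name is only an example
[useful_logs]
search = index=test sourcetype=abc "login" "user" "deviceId"

and then search your useful data with:

eventtype=useful_logs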


xsstest
Communicator

What should I do ?


s2_splunk
Splunk Employee

The simplest approach is to define an eventtype, as suggested, and use that for searching your useful logs. I'd recommend reading up on eventtypes and applying them to your problem to see if that helps.
If that doesn't solve your problem, please describe it in more detail.


gcusello
SplunkTrust

Hi xsstest,
reindexing already-indexed logs isn't a good idea because you get double license consumption!

If you can identify the logs you want using source or sourcetype, you will find that, even with many different events in your index, your search will be very quick.

If you still want to put the selected logs in another index, you could do one of two things:

  • put the selected logs in a summary index,
  • extract the selected logs and reindex them in the new index.

The first solution has the advantage that you don't have double license consumption and your searches will be very fast, but you will have to rebuild all your searches and field extractions because you have to use a different search syntax.
To do this, run something like the following:

index=test sourcetype=abc "login" "user" "deviceId"
| table _time _raw
| tscollect namespace=Useful_logs

To access these logs, use:

| tstats count FROM Useful_logs GROUPBY _time _raw | ...

For the second solution, run your search index=test sourcetype=abc "login" "user" "deviceId", then download the results in raw format so that you can ingest them into the new index.
This second solution has double license consumption, but your searches will stay the same (only the index is different).
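
For example (just a sketch; the file path is a placeholder and I assume you have already created the Useful_logs index), you could re-ingest the downloaded raw file with the Splunk CLI:

./splunk add oneshot /tmp/useful_logs_export.raw -index Useful_logs -sourcetype abc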

Bye.
Giuseppe


xsstest
Communicator

Use the summary index?
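
For example, would a scheduled search like this in savedsearches.conf do what I want? (Just my rough sketch; the stanza name populate_useful_logs is made up, and I assume the Useful_logs index already exists.)

# savedsearches.conf - stanza name is only an example
[populate_useful_logs]
enableSched = 1
# run at midnight every day over the previous day's events
cron_schedule = 0 0 * * *
dispatch.earliest_time = -1d@d
dispatch.latest_time = @d
search = index=test sourcetype=abc "login" "user" "deviceId" | collect index=Useful_logs

If I understand correctly, collect writes the events with the stash sourcetype by default, so they should not count against the license.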
