Monitoring Splunk

Indexing logs as events

danillopavan
Communicator

Hello all,

I would like to monitor a file that changes every 15 minutes (it is the only file in its directory), and it is a very large log file (roughly 100 MB to 150 MB). I have some questions about that:

  1. Is there any way to index just the recent changes and not the entire file each time? I read something about the followTail setting, but I am not sure whether it is really appropriate (see the inputs.conf sketch below);
  2. Can I index only some lines of the log rather than all of the recent changes? Something like: only the new lines that start with a specific text. Maybe a regex could be applied here (see the props.conf / transforms.conf sketch below);
  3. Can the lines from item 2 be indexed as one single event per repetition (see the line-merging and search sketches further down)? For example, take this log file content:

S Sß: (2017120211271200) sending job @>SPOREQ:1597246@DEV:JC15@<'
S 2 pages (OTF) printed in 0 seconds, avg. 0.0 pages per sec
S Timeinfo @>SPOREQ:587821@DEV:DS01@<): 0 1 List ( 0 0 0 0 0 0 )
S Sß: (2017120211271300) ....end job @>SPOREQ:1597246@DEV:JC15@<'
S <-- Job @>SPOREQ:1597246@</1 processed (rc=0) }

And then have a single-line result containing the information below, based on the above 5 lines:
Start Time | SPOREQ Number | Printer Name | Number of Pages | Print Duration | Print Average | Finish Time | Status
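For point 1, this is the kind of inputs.conf stanza I have in mind (the path, index and sourcetype names below are just placeholders for my real ones):

[monitor:///var/log/spool/print.log]
sourcetype = print_spool
index = main
# The monitor input remembers how far it has read, so after the first full read
# only newly appended lines should be indexed; followTail = 1 would additionally
# skip the content that already exists in the file on the very first read.
followTail = 1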
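For point 2, my understanding is that the usual approach is to send everything to the nullQueue and route only the wanted lines back to the indexQueue, on the indexer or a heavy forwarder. A rough sketch, assuming the same sourcetype and that I only want to keep lines starting with "S " (the regex is a placeholder for my real pattern):

props.conf:
[print_spool]
TRANSFORMS-filter = drop_all, keep_wanted

transforms.conf:
[drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_wanted]
REGEX = ^S\s
DEST_KEY = queue
FORMAT = indexQueue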
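For point 3, one option I am considering is merging all the lines of one job into a single event at index time, breaking a new event whenever a "sending job" line appears. A sketch, assuming the number in parentheses is a timestamp in the form YYYYMMDDhhmmss plus two extra digits:

props.conf:
[print_spool]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = sending job @>SPOREQ:
TIME_PREFIX = \(
TIME_FORMAT = %Y%m%d%H%M%S
MAX_TIMESTAMP_LOOKAHEAD = 16

I am not sure yet how points 2 and 3 interact (the nullQueue filter would run on whole events after line merging), so I may end up needing only one of the two.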
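Assuming each job then ends up as one multi-line event, a search like the following should give the single-row output I described (all field names are just my own choices):

sourcetype=print_spool
| rex "\((?<start_time>\d{16})\) sending job @>SPOREQ:(?<sporeq>\d+)@DEV:(?<printer>\w+)@"
| rex "(?<pages>\d+) pages \(OTF\) printed in (?<duration>\d+) seconds, avg\. (?<avg>[\d.]+) pages per sec"
| rex "\((?<finish_time>\d{16})\) \.+end job"
| rex "processed \(rc=(?<status>\d+)\)"
| table start_time, sporeq, printer, pages, duration, avg, finish_time, status

If the lines stay as separate events instead, the same rex extractions followed by stats values(*) as * by sporeq (or a transaction on the SPOREQ number) should also produce one row per job.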

Many many many thanks for the support!
Danillo Pavan

