Getting Data In

Training Splunk to recognize events from Bacula

ebailey
Communicator

I am running into trouble getting Splunk to properly break events from Bacula. Below is an example of a Bacula event; it represents a single backup job in the Bacula log. I thought it would be easy to set up since every backup job has a common JobId. I thought I could give Splunk a LINE_BREAKER and then a regex, but that is not working. Do I need to use the transaction command instead? I appreciate any help, since I am getting nowhere fast.

Thanks

Ed

14-Mar 04:09 xxx-dir JobId 211: Start Backup JobId 211, Job=xxx.2011-03-14_04.05.00_29
14-Mar 04:09 xxx-dir JobId 211: Using Device "FileStorage"
14-Mar 04:09 xxx-sd JobId 211: Volume "Vol0007" previously written, moving to end of data.
14-Mar 04:09 xxx-sd JobId 211: Ready to append to end of Volume "Vol0007" size=36444353196
14-Mar 04:09 xxx-sd JobId 211: Job write elapsed time = 00:00:01, Transfer rate = 1.871 M Bytes/second
14-Mar 04:09 xxx-dir JobId 211: Bacula xxx-dir 5.0.3 (30Aug10): 14-Mar-2011 04:09:16
  Build OS:               i686-redhat-linux-gnu redhat Enterprise release
  JobId:                  211
  Job:                    chi01fep110.2011-03-14_04.05.00_29
  Backup Level:           Incremental, since=2011-03-13 04:07:25
  Client:                 "xxx-fd" 5.0.3 (30Aug10) i686-redhat-linux-gnu,redhat,Enterprise release
  FileSet:                "standard_etc" 2011-03-03 23:05:00
  Pool:                   "File" (From Job resource)
  Catalog:                "MyCatalog" (From Client resource)
  Storage:                "File" (From Job resource)
  Scheduled time:         14-Mar-2011 04:05:00
  Start time:             14-Mar-2011 04:09:16
  End time:               14-Mar-2011 04:09:16
  Elapsed time:           0 secs
  Priority:               10
  FD Files Written:       784
  SD Files Written:       784
  FD Bytes Written:       1,788,470 (1.788 MB)
  SD Bytes Written:       1,871,767 (1.871 MB)
  Rate:                   0.0 KB/s
  Software Compression:   64.7 %
  VSS:                    no
  Encryption:             no
  Accurate:               no
  Volume name(s):         Vol0007
  Volume Session Id:      146
  Volume Session Time:    1299264299
  Last Volume Bytes:      36,446,249,241 (36.44 GB)
  Non-fatal FD errors:    0
  SD Errors:              0
  FD termination status:  OK
  SD termination status:  OK
  Termination:            Backup OK

14-Mar 04:09 xxx-dir JobId 211: Begin pruning Jobs older than 1 month .
14-Mar 04:09 xxx-dir JobId 211: No Jobs found to prune.
14-Mar 04:09 xxx-dir JobId 211: Begin pruning Jobs.
14-Mar 04:09 xxx-dir JobId 211: No Files found to prune.
14-Mar 04:09 xxx-dir JobId 211: End auto prune.

dwaddle
SplunkTrust

Hi Ed,

In this instance I personally would probably treat this as several events and use transaction as needed to weld them back together. I would probably use a BREAK_ONLY_BEFORE in props.conf such that lines with a date/time on them delineate events.
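If you do go the transaction route, the search would look roughly like this (a sketch only; the rex extraction for JobId is my assumption, since Bacula's fields are not extracted automatically):

```
sourcetype=bacula
| rex "JobId\s+(?<JobId>\d+)"
| transaction JobId maxspan=1h
```

The maxspan value is just a guess based on typical job length; adjust it to cover your longest backup jobs.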

[bacula]
BREAK_ONLY_BEFORE = ^\d{2}-[A-Za-z]{3}\s+\d{2}:\d{2}\s+
SHOULD_LINEMERGE = true
TIME_FORMAT = %d-%b %H:%M
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 13

Put that in your props.conf for this sourcetype. If it works right, your multi-line event should be treated as a single event. This will only take effect on data indexed after the change is made.
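A quick way to sanity-check that BREAK_ONLY_BEFORE regex outside Splunk is to run it against a few sample lines (a Python sketch for verification only, not anything Splunk itself runs):

```python
import re

# The BREAK_ONLY_BEFORE pattern from the props.conf stanza above:
# a dd-Mon HH:MM timestamp at the start of the line.
pattern = re.compile(r"^\d{2}-[A-Za-z]{3}\s+\d{2}:\d{2}\s+")

sample = [
    '14-Mar 04:09 xxx-dir JobId 211: Using Device "FileStorage"',
    "  Elapsed time:           0 secs",
    "14-Mar 04:09 xxx-dir JobId 211: End auto prune.",
]

# Only lines beginning with a timestamp should start a new event;
# the indented job-summary lines should merge into the previous one.
for line in sample:
    print(bool(pattern.match(line)), line[:45])
```

Lines 1 and 3 match (new event), the indented summary line does not (merged).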

UPDATE - Ed, please try the above props.conf entry - it worked properly for me on your test data. Timestamps came out correct, and lines were delineated/merged appropriately.

dwaddle
SplunkTrust

Ed, see update above....


ebailey
Communicator

The events are still being broken at points unrelated to the date/time. I added new data to the log after putting the above in props.conf. Any ideas? Thanks!


ebailey
Communicator

Will give this a try and get back to you - Thanks!
