Splunk Search

Is using DELIMS in transforms.conf appropriate for a syslog event that contains null values?

tmarlette
Motivator

I have a syslog event whose format remains constant; however, I'm having some trouble getting transforms.conf (DELIMS) to pull the fields out.

Here is an example of the data:

Jan 13 13:05:11 10.95.219.69 2017-01-13T13: 05:11.564-0500 "domain\myUser" "AGSID:oTNLYjm4OTBkZWUx" "" "10.10.10.10" "Login" "failed" "" "" "CitrixReceiver/com.enterprise.beta iOS/1.6.0 (build 1.6.0.21) CitrixReceiver-iPhone CFNetwork Darwin VpnCapable X1Class AuthManager/4.6.1.32" "NSG SSO login failed"

These are the settings in my transforms.conf:

[mySourcetype]
DELIMS = " "
FIELDS = user,field2,field3,field4,field5,type,field7,field8,field9,field10,field11,field12,field13,field14

I'm not sure whether there is a delimiter setting that will help me here, but I'm open to suggestions.
I was thinking I could use REGEX to set these fields, but the null values ("") are getting in the way, since there are no characters between the quotes.
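For what it's worth, the empty-quote problem itself is regex-friendly: a `[^"]*` group (zero or more characters) matches a `""` field and yields an empty string rather than failing. A quick sketch in Python, just to illustrate outside Splunk:

```python
import re

# The sample event from above, as a Python string.
event = ('Jan 13 13:05:11 10.95.219.69 2017-01-13T13: 05:11.564-0500 '
         '"domain\\myUser" "AGSID:oTNLYjm4OTBkZWUx" "" "10.10.10.10" '
         '"Login" "failed" "" "" '
         '"CitrixReceiver/com.enterprise.beta iOS/1.6.0 (build 1.6.0.21) '
         'CitrixReceiver-iPhone CFNetwork Darwin VpnCapable X1Class '
         'AuthManager/4.6.1.32" "NSG SSO login failed"')

# [^"]* (zero or more) also matches the empty "" fields, so nulls come
# back as empty strings instead of being skipped.
fields = re.findall(r'"([^"]*)"', event)
# fields[0] is the user, fields[2] is one of the null ("") values.
```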

Any suggestions are welcome, and appreciated.

Thank you !!


woodcock
Esteemed Legend

Try this instead:

[mySourcetype]
REGEX = ^([^"]+)\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"\s+"([^"]*)"$
FORMAT = user::$2 field2::$3 field3::$4 field4::$5 field5::$6 type::$7 field7::$8 field8::$9 field9::$10 field10::$11
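One way to sanity-check a REGEX like this before putting it into transforms.conf is to try it against the raw event in Python first. A sketch; the pattern below is the same idea, built with ten quoted groups to match the ten quoted fields in the sample event:

```python
import re

event = ('Jan 13 13:05:11 10.95.219.69 2017-01-13T13: 05:11.564-0500 '
         '"domain\\myUser" "AGSID:oTNLYjm4OTBkZWUx" "" "10.10.10.10" '
         '"Login" "failed" "" "" '
         '"CitrixReceiver/com.enterprise.beta iOS/1.6.0 (build 1.6.0.21) '
         'CitrixReceiver-iPhone CFNetwork Darwin VpnCapable X1Class '
         'AuthManager/4.6.1.32" "NSG SSO login failed"')

# Syslog prefix in group 1, then ten quoted fields; [^"]* lets the
# "" null fields match as empty strings.
pattern = re.compile(r'^([^"]+)\s+' + r'\s+'.join([r'"([^"]*)"'] * 10) + r'$')
m = pattern.match(event)
# m.group(2) is the user, m.group(4) a null field, m.group(11) the last field.
```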

tmarlette
Motivator

This extracts some fields, but matching only 4 out of 104 events is below tolerance.

The trouble I'm having is capturing the null values: they are printed as "" when no value exists, but in the next event the value could be there. It looks like you're hitting the same thing.


woodcock
Esteemed Legend

If you expect us to give you good help, then you need to give us good raw data. Share more variety.


somesoni2
Revered Legend

You can use a regular expression to extract the fields even when there are null values. Try this:

props.conf

[yoursourcetype]
EXTRACT-fields = ^[^\"]+\"(?<user>[^\"]*)\"\s+\"(?<field2>[^\"]*)\"\s+\"(?<field3>[^\"]*)\"\s+\"(?<field4>[^\"]*)\"\s+\"(?<field5>[^\"]*)\"\s+\"(?<type>[^\"]*)\"\s+\"(?<field7>[^\"]*)\"\s+\"(?<field8>[^\"]*)\"\s+\"(?<field9>[^\"]*)\"\s+\"(?<field10>[^\"]*)\"$
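The same idea can be checked in Python before deploying. A sketch; note that Python spells named groups `(?P<name>...)` where Splunk's PCRE accepts `(?<name>...)`, and this version assumes exactly the ten quoted fields seen in the sample event:

```python
import re

event = ('Jan 13 13:05:11 10.95.219.69 2017-01-13T13: 05:11.564-0500 '
         '"domain\\myUser" "AGSID:oTNLYjm4OTBkZWUx" "" "10.10.10.10" '
         '"Login" "failed" "" "" '
         '"CitrixReceiver/com.enterprise.beta iOS/1.6.0 (build 1.6.0.21) '
         'CitrixReceiver-iPhone CFNetwork Darwin VpnCapable X1Class '
         'AuthManager/4.6.1.32" "NSG SSO login failed"')

# Build the named-group pattern; the names follow the asker's FIELDS list.
names = ['user', 'field2', 'field3', 'field4', 'field5',
         'type', 'field7', 'field8', 'field9', 'field10']
pattern = re.compile(
    r'^[^"]+' + r'\s+'.join(r'"(?P<%s>[^"]*)"' % n for n in names) + r'$')
m = pattern.match(event)
# m.group('user'), m.group('type'), etc.; null fields come back as ''.
```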

niketn
Legend

If you use space as the delimiter, the timestamp will also be split into fields, so the values may show up differently than you expect.

You can add a single event in test mode, or load data in Data Preview mode, to verify your sourcetype. Make sure the correct field is being used to identify the timestamp.

If you are only interested in the user and type fields, you can perform interactive field extractions using "Extract new fields" at search time.

user
^[^"\n]*"(?P<user>[^"]+)

type
^(?:[^.\n]*.){7}\d+"\s+"\w+"\s+"(?P<type>\w+)
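These two search-time patterns can also be tried in Python (a sketch; Python's named-group syntax is `(?P<name>...)` rather than PCRE's `(?<name>...)`):

```python
import re

event = ('Jan 13 13:05:11 10.95.219.69 2017-01-13T13: 05:11.564-0500 '
         '"domain\\myUser" "AGSID:oTNLYjm4OTBkZWUx" "" "10.10.10.10" '
         '"Login" "failed" "" "" '
         '"CitrixReceiver/com.enterprise.beta iOS/1.6.0 (build 1.6.0.21) '
         'CitrixReceiver-iPhone CFNetwork Darwin VpnCapable X1Class '
         'AuthManager/4.6.1.32" "NSG SSO login failed"')

# The user pattern grabs the first quoted field; the type pattern counts
# seven dots (three in each IP address, one in the timestamp) to land on
# the field after "Login".
user_re = re.compile(r'^[^"\n]*"(?P<user>[^"]+)')
type_re = re.compile(r'^(?:[^.\n]*.){7}\d+"\s+"\w+"\s+"(?P<type>\w+)')

user = user_re.match(event).group('user')
typ = type_re.match(event).group('type')
```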

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"