Hello all,
I'm relatively new to Splunk and have been trying to correlate a series of events that occur in our logs.
2013-01-14 11:12:20,512 [71] 54110 INFO WebService RequestTypeA .......
2013-01-14 11:12:23,512 [71] 54110 INFO WebService UserLogin: Tester .......
2013-01-14 11:12:25,512 [71] 54110 INFO WebService Response .......
The log receives thousands of entries per minute, so the way I've been handling this manually is by grepping through our log files for:
[71] 54110
Because not all of the information is ever present within a single log entry, I'd like to chain the entries together using the transaction command, keyed on a field that matches the following regex (capture groups included so I can reuse the two numbers; note the second number in the sample log has no brackets):
\[(\d+)\]\s+(\d+)
That seems to work based on what I'm seeing at http://regexpal.com/. Is this a scenario where I would need to create a custom field at index time?
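As an alternative, I've been experimenting with a search-time extraction via rex to test the idea before touching the indexer config. This is just a sketch; the field names tid and procid are placeholders I made up:

index="main" "RequestTypeA"
| rex "\[(?<tid>\d+)\]\s+(?<procid>\d+)"
| eval pid_thread_key = tid . "_" . procid

If that holds up, the same key could presumably be promoted to an index-time field later.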
The search I'd like to run would look for a particular type of request, look up its corresponding unique identifier ([71] 54110), and group the matching events as a transaction. Then it would take that result set and look up all events containing "UserLogin". The search might look something like:
index="main" "RequestTypeA" | transaction pid_thread_key | search "UserLogin"
To do this, I've modified the following:
transforms.conf
[pididThreadid]
REGEX = \[(\d+)\]\s+(\d+)
FORMAT = pid_thread_key::$1$2
WRITE_META = true
REPEAT_MATCH = false
CLEAN_KEYS = 1
props.conf
[log4j]
TRANSFORMS-pididThreadid = pididThreadid
fields.conf
[pid_thread_key]
INDEXED=true
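With those in place (and after a restart and reindexing, since index-time extractions only apply to newly indexed data), I'm expecting to be able to run something like this. The maxspan value is just a guess on my part to keep transactions bounded:

index="main" "RequestTypeA"
| transaction pid_thread_key maxspan=1m
| search "UserLogin"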
Am I on the right track here?