I have a Cisco ACS serving RADIUS requests for VPN users. Syslog is configured to forward to Splunk, which receives the data and indexes all fields. Below are sample log entries for a particular user (XXXX_user93), showing only the interesting fields:
a) RADIUS Start record
Feb 21 10:32:34 2012-02-21 10:32:34.134 +05:30 NOTICE Radius-Accounting: RADIUS Accounting start request, NetworkDeviceName=XXXX_Roam_Connect, User-Name=XXXX_user93, Framed-IP-Address=10.32.38.93, Calling-Station-ID=188.8.131.52, NAS-Identifier= YYYY-FG-MUMENT, Acct-Status-Type=Start, Acct-Session-Id=00a2fc9c, AcsSessionID= INMAA-TDL-ACS-I/112925452/1282834,
b) Corresponding RADIUS Stop record
Feb 21 10:32:41 2012-02-21 10:32:41.127 +05:30 NOTICE Radius-Accounting: RADIUS Accounting stop request, NetworkDeviceName=XXXX_Roam_Connect, User-Name=XXXX_user93, Framed-IP-Address=10.32.38.93, Calling-Station-ID=184.108.40.206, NAS-Identifier=YYYY-FG-MUMENT, Acct-Status-Type=Stop, Acct-Session-Id=00a2fc9c, Acct-Session-Time=468, Acct-Terminate-Cause=NAS Error, AcsSessionID=INMAA-TDL-ACS-I/112925452/1282838
The start and stop requests are correlated by the Acct-Session-Id field, whose value is the same for a particular user's start and stop records.
What we are looking for is a daily, weekly, and monthly report in a tabular format, something similar to this:
NetworkDeviceName | Username | Starttime | Endtime | Acct-Session-Id | Acct-Session-Time
I have tried this myself with no success.
We are currently evaluating Splunk and would like help in achieving the above. I have searched for apps on Splunkbase and couldn't find any.
Thanks in advance.
What ways have you tried without success? I can think of two ways you could achieve this fairly easily: either using transaction or using stats.
1) You can use transaction: the only thing that might be a bit tricky is getting the start time and end time right. transaction always produces the field duration, which is exactly what it says, the duration of the transaction (i.e. the difference between the timestamps of the first and last events in the transaction), so adding that value to the transaction's _time gives you the end time. I also did some time output formatting using strftime to make the timestamps human-readable. The duration field could also be used instead of the Acct-Session-Time field, in case you don't trust the duration calculated by the ACS :)
... | transaction Acct-Session-Id | eval Starttime=strftime(_time,"%+") | eval Endtime=strftime(_time+duration,"%+") | table NetworkDeviceName User-Name Starttime Endtime Acct-Session-Id Acct-Session-Time
The drawback with using transaction is that it can be pretty resource-intensive.
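One way to rein in the cost of transaction is to tell it explicitly where sessions start and stop, and to cap how long it keeps a session open. A sketch along those lines, assuming the Acct-Status-Type field is extracted as in the sample events and that 24h is a reasonable maximum session length for your environment:
... | transaction Acct-Session-Id startswith="Acct-Status-Type=Start" endswith="Acct-Status-Type=Stop" maxspan=24h | eval Starttime=strftime(_time,"%+") | eval Endtime=strftime(_time+duration,"%+") | table NetworkDeviceName User-Name Starttime Endtime Acct-Session-Id Acct-Session-Time
The startswith/endswith constraints let Splunk close a transaction as soon as the stop record is seen, and maxspan bounds the memory held for sessions whose stop record never arrives.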
2) You can use stats and split on Acct-Session-Id, because it is a unique identifier for each session. stats always requires a statistical operation to be run on each field it takes as input, so for fields carrying absolute values, just use first() to get the value. NOTE that I use last(_time) and first(_time) for getting the Starttime and Endtime, respectively - this might seem confusing, but it has to do with how results arrive in reverse chronological order from the search pipeline: the latest events come first, so first(_time) grabs the latest event it comes across. The eval/strftime statements are again there to produce human-readable timestamps.
... | stats first(NetworkDeviceName) as NetworkDeviceName, first(User-Name) as Username, last(_time) as Starttime, first(_time) as Endtime, first(Acct-Session-Time) as Acct-Session-Time by Acct-Session-Id | eval Starttime=strftime(Starttime,"%+") | eval Endtime=strftime(Endtime,"%+") | table NetworkDeviceName Username Starttime Endtime Acct-Session-Id Acct-Session-Time
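To get the daily, weekly, and monthly reports you asked for, either search can be saved as a scheduled report with a matching time range. A sketch for the daily case - the index and sourcetype names here are assumptions, so replace them with whatever your ACS syslog input actually uses:
index=main sourcetype=cisco_acs "Radius-Accounting" earliest=-1d@d latest=@d | stats first(NetworkDeviceName) as NetworkDeviceName, first(User-Name) as Username, last(_time) as Starttime, first(_time) as Endtime, first(Acct-Session-Time) as Acct-Session-Time by Acct-Session-Id | eval Starttime=strftime(Starttime,"%+") | eval Endtime=strftime(Endtime,"%+") | table NetworkDeviceName Username Starttime Endtime Acct-Session-Id Acct-Session-Time
For the weekly report use earliest=-1w@w latest=@w, and for the monthly report earliest=-1mon@mon latest=@mon, then schedule each saved report to run just after its period closes.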