
I have a Cisco ACS serving RADIUS requests for VPN users. Syslog output is sent to Splunk, which receives the data and indexes all fields. Below are the sample log texts for a particular user (XXXX_user93), showing only the interesting fields:

a) RADIUS Start record

Feb 21 10:32:34 2012-02-21 10:32:34.134 +05:30 NOTICE Radius-Accounting: RADIUS Accounting start request, NetworkDeviceName=XXXX_Roam_Connect, User-Name=XXXX_user93, Framed-IP-Address=10.32.38.93, Calling-Station-ID=113.128.64.130, NAS-Identifier= YYYY-FG-MUMENT, Acct-Status-Type=Start, Acct-Session-Id=00a2fc9c, AcsSessionID= INMAA-TDL-ACS-I/112925452/1282834,

b) Corresponding RADIUS Stop record

Feb 21 10:32:41 2012-02-21 10:32:41.127 +05:30 NOTICE Radius-Accounting: RADIUS Accounting stop request, NetworkDeviceName=XXXX_Roam_Connect, User-Name=XXXX_user93, Framed-IP-Address=10.32.38.93, Calling-Station-ID=113.128.64.130, NAS-Identifier=YYYY-FG-MUMENT, Acct-Status-Type=Stop, Acct-Session-Id=00a2fc9c, Acct-Session-Time=468, Acct-Terminate-Cause=NAS Error, AcsSessionID=INMAA-TDL-ACS-I/112925452/1282838

The start and stop requests are correlated by the Acct-Session-Id field, whose value is the same for a particular user's start and stop records.

What we are looking for is a daily, weekly, and monthly report in a tabular format, something similar to this:

NetworkDeviceName | Username | Starttime | Endtime | Acct-Session-Id | Acct-Session-Time

I have tried this, but with no success.

We are currently evaluating Splunk and would like help in achieving the above. I have searched for apps on Splunkbase and couldn't find any.

Thanks in advance.

asked 20 Feb '12, 21:28 by raki


One Answer:

What ways have you tried without success? I can think of two ways you could achieve this fairly easily: either using transaction or using stats.

1) transaction: the only thing that might be a bit tricky is getting the start time and end time right. transaction always produces the field duration, which is exactly what it says: the duration of the transaction (the difference between the timestamps of the first and last event in the transaction), so adding that value to the transaction's _time gives you the end time. I also did some time output formatting using strftime in order to make the timestamps human-readable. The duration field could also be used instead of the Acct-Session-Time field, in case you don't trust the duration calculated by the ACS :)

... | transaction Acct-Session-Id | eval Starttime=strftime(_time,"%+") | eval Endtime=strftime(_time+duration,"%+") | table NetworkDeviceName User-Name Starttime Endtime Acct-Session-Id Acct-Session-Time

The drawback with using transaction is that it can be pretty resource-intensive.

2) You can use stats and split on Acct-Session-Id, because it is a unique identifier for each session. stats always requires a statistical operation to be run on each field it takes as input, so for fields carrying absolute values, just use first() to get the value. NOTE that I use last(_time) and first(_time) for getting the Starttime and Endtime, respectively - this might seem confusing, but it has to do with how results arrive in reverse chronological order from the search pipeline: the latest event comes first, and therefore first(_time) grabs the latest event it comes across. Also, the strftime evals are again there in order to get human-readable timestamps.

... | stats first(NetworkDeviceName) AS NetworkDeviceName first(User-Name) AS User-Name first(Acct-Session-Time) AS Acct-Session-Time last(_time) AS Starttime first(_time) AS Endtime by Acct-Session-Id | eval Starttime=strftime(Starttime,"%+") | eval Endtime=strftime(Endtime,"%+") | table NetworkDeviceName User-Name Starttime Endtime Acct-Session-Id Acct-Session-Time

answered 20 Feb '12, 23:37 by Ayn ♦

Thanks Ayn, I had been trying the transaction method without any success. Anyway, I applied your searches and here are the results:

a) The transaction method works perfectly. One question I have though: how can I group the table by username, since a single username has multiple records? I would like to publish a report by username with all its associated records.

b) The stats method works, but is there a way I can get the duration as in transaction rather than the Acct-Session-Time reported by the ACS?

(21 Feb '12, 02:38) raki

a) A mix of transaction and stats might be a good idea - use the basic command in 1) and then do something like in 2), adding a "count" to the stats.
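For example (a minimal sketch, assuming the same field names as above and leaving out the base search, as in the answer), grouping all session records per user with a count could look like:

... | transaction Acct-Session-Id | eval Starttime=strftime(_time,"%+") | eval Endtime=strftime(_time+duration,"%+") | stats count list(NetworkDeviceName) AS NetworkDeviceName list(Starttime) AS Starttime list(Endtime) AS Endtime list(Acct-Session-Id) AS Acct-Session-Id list(Acct-Session-Time) AS Acct-Session-Time by User-Name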

b) You can eval the difference between the last and first timestamp for each stats entry, which will give you the duration.
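Again just a sketch with the same assumed field names, computing the duration from the stats timestamps before formatting them:

... | stats first(NetworkDeviceName) AS NetworkDeviceName first(User-Name) AS User-Name last(_time) AS Starttime first(_time) AS Endtime by Acct-Session-Id | eval duration=Endtime-Starttime | eval Starttime=strftime(Starttime,"%+") | eval Endtime=strftime(Endtime,"%+") | table NetworkDeviceName User-Name Starttime Endtime Acct-Session-Id duration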

(21 Feb '12, 06:52) Ayn ♦

For doing this in a production situation - especially with a monthly report - I would look into summary indexing. Also, depending on how long someone is connected, it may be better to maintain connection state in a lookup table, ala http://blogs.splunk.com/2011/01/11/maintaining-state-of-the-union/

(21 Feb '12, 06:59) dwaddle ♦
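A very rough sketch of that lookup approach (the index name and the lookup file name vpn_session_state.csv are made up for illustration, and the lookup file has to exist, even if empty, before the first run) - a scheduled search that keeps the latest known status per session:

index=radius Acct-Status-Type=* | eval last_seen=_time, status='Acct-Status-Type' | table Acct-Session-Id User-Name NetworkDeviceName status last_seen | inputlookup append=t vpn_session_state.csv | sort 0 - last_seen | dedup Acct-Session-Id | outputlookup vpn_session_state.csv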