Getting Data In

How to calculate the total number of business hours of an event from a CSV file?

karthikTIL
Path Finder

Hi, I have a file called file1.csv whose fields are start_time and close_time:

start_time              close_time
22/08/2014 3:00:08 PM   26/08/2014 4:50:32 PM
25/08/2014 5:54:53 PM   26/08/2014 1:28:55 PM
25/08/2014 3:30:03 PM   27/08/2014 5:04:57 PM

I want to create a third field, total_time, whose value would be (close_time) - (start_time). But when calculating total_time, I want Splunk to count business time only,
i.e., from 9 AM to 5 PM, Monday to Friday.

So total_time for the three examples above would be 17:50, 4:28 and 17:30 respectively.
Could you let me know what the Splunk query would be to get this result? Thank you.

1 Solution

javiergn
Super Champion

Hi,

I was curious about your request so I tried this in my lab using your example csv and I think I managed to get the result you were expecting:

| inputcsv mycsv.csv

| eval start_time_epoch = strptime(start_time,"%d/%m/%Y %I:%M:%S %p")
| eval start_time_second = strftime(start_time_epoch,"%S")
| eval start_time_epoch_rounded = start_time_epoch - start_time_second
| fields - start_time_epoch, start_time_second

| eval close_time_epoch = strptime(close_time,"%d/%m/%Y %I:%M:%S %p")
| eval close_time_second = strftime(close_time_epoch,"%S")
| eval close_time_epoch_rounded = close_time_epoch - close_time_second
| fields - close_time_epoch, close_time_second

| eval minute = mvrange(0, (close_time_epoch_rounded - start_time_epoch_rounded), 60)
| mvexpand minute

| eval _time = start_time_epoch_rounded + minute
| eval myHour = strftime(_time,"%H")
| eval myDay = strftime(_time,"%A")

| where myDay != "Saturday" AND myDay != "Sunday" AND myHour >= 9 AND myHour < 17
| stats count as durationInMinutes by start_time, close_time
| eval duration = tostring(durationInMinutes*60, "duration")

Output:

start_time              close_time              durationInMinutes   duration
22/08/2014 3:00:08 PM   26/08/2014 4:50:32 PM   1070                17:50:00
25/08/2014 3:30:03 PM   27/08/2014 5:04:57 PM   1050                17:30:00
25/08/2014 5:54:53 PM   26/08/2014 1:28:55 PM   268                 04:28:00

Thanks,
J


nikkkc
Path Finder

Wow, many thanks,
I'm glad! It works perfectly, you are the best!!! 🙂


nikkkc
Path Finder

One more question: could you explain your search query a little bit... so I could learn something? Thanks


javiergn
Super Champion

Sure, I guess the meat is in the following lines:

| eval minute = mvrange(0, (close_time_epoch_rounded - start_time_epoch_rounded), 60)
| mvexpand minute
| eval _time = start_time_epoch_rounded + minute
....
| where myDay != "Saturday" AND myDay != "Sunday" AND myHour >= 9 AND myHour < 17
| stats count as durationInMinutes by start_time, close_time
| eval duration = tostring(durationInMinutes*60, "duration")

Because you only have two static timestamps (start and close), mvrange + mvexpand will generate one event every 60 seconds, from 0 up to the difference in seconds between the start and close times.
The eval _time then adds that minute offset to the start time. For instance:

_time, start_time, close_time
17/Feb/2016 9:00:00, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00

Will become:

_time, start_time, close_time
17/Feb/2016 9:00, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00
17/Feb/2016 9:01, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00
17/Feb/2016 9:02, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00
17/Feb/2016 9:03, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00
17/Feb/2016 9:04, 17/Feb/2016 9:00:00, 17/Feb/2016 9:05:00

We then filter by working days and business hours only,
and count the total number of minutes between start_time and close_time.

The eval duration is just there to display the result in a nice format.
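If it helps to see the same idea outside Splunk, here is a rough Python sketch of the minute-expansion logic (not part of the original answer; the format string and the Mon-Fri 09:00-16:59 window are taken from the SPL above, and the function names are my own):

```python
from datetime import datetime, timedelta

FMT = "%d/%m/%Y %I:%M:%S %p"  # same format string as the strptime calls above

def business_minutes(start, close):
    """Count minutes between start and close that fall Mon-Fri, 09:00-16:59.

    Mirrors mvrange + mvexpand: one tick per minute from start to close,
    after truncating seconds (the epoch "rounding" step in the SPL)."""
    t = datetime.strptime(start, FMT).replace(second=0)
    end = datetime.strptime(close, FMT).replace(second=0)
    count = 0
    while t < end:
        if t.weekday() < 5 and 9 <= t.hour < 17:  # weekday(): Mon=0 .. Sun=6
            count += 1
        t += timedelta(minutes=1)
    return count

def as_duration(minutes):
    # Rough equivalent of tostring(minutes*60, "duration")
    return f"{minutes // 60:02d}:{minutes % 60:02d}:00"

print(business_minutes("22/08/2014 3:00:08 PM", "26/08/2014 4:50:32 PM"))  # 1070
print(as_duration(1070))  # 17:50:00
```

Running it over the three example rows reproduces the 1070, 268 and 1050 minute totals in the output table.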


HeinzWaescher
Motivator

I'm sure there is an easier way, but you could try something like this to build a filter before your calculation:

| eval start_weekday=strftime(strptime(start_time,"%d/%m/%Y %I:%M:%S %p"),"%w")
| eval start_hour=strftime(strptime(start_time,"%d/%m/%Y %I:%M:%S %p"),"%H")

| eval close_weekday=strftime(strptime(close_time,"%d/%m/%Y %I:%M:%S %p"),"%w")
| eval close_hour=strftime(strptime(close_time,"%d/%m/%Y %I:%M:%S %p"),"%H")

| search start_weekday>=1 AND start_weekday<=5 AND start_hour>=9 AND start_hour<=17 AND close_weekday>=1 AND close_weekday<=5 AND close_hour>=9 AND close_hour<=17
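For what it's worth, a quick Python sketch of the same filter idea (illustrative only; the `%w` weekday convention, 0 = Sunday, and the format string follow the strftime calls above, and the sample rows are made up):

```python
from datetime import datetime

FMT = "%d/%m/%Y %I:%M:%S %p"

def in_window(ts):
    # Extract a %w-style weekday (0 = Sunday) and the hour, as the evals above do
    t = datetime.strptime(ts, FMT)
    weekday = (t.weekday() + 1) % 7  # map Python's Mon=0 onto strftime's %w (Sun=0)
    return 1 <= weekday <= 5 and 9 <= t.hour <= 17

# Keep only rows whose start AND close both fall inside the Mon-Fri 9-17 window
rows = [
    ("22/08/2014 3:00:08 PM", "26/08/2014 4:50:32 PM"),   # Fri -> Tue: kept
    ("23/08/2014 10:00:00 AM", "24/08/2014 11:00:00 AM"), # Sat -> Sun: filtered out
]
kept = [r for r in rows if in_window(r[0]) and in_window(r[1])]
print(len(kept))  # 1
```

Note this only filters whole events by their endpoints; it doesn't clip the duration to business hours the way the accepted answer does.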


nikkkc
Path Finder

I have nearly the same problem. Does anyone have an answer? Thanks
