Splunk Search

How do I get the lower and upper bound of a timechart and use these in an eval?

dojiepreji
Path Finder

Is there any way to get the upper and lower bound dates for a timechart that has a span of weeks?

| timechart span=w@w0 count(eval(current_ticket_state)) as Assigned, count(eval(resolved_date < lower_bound_date_of_week OR resolved_date > upper_bound_date_of_week)) as Open

I only need to count the entries in which the resolved_date fits in the weekly range. My resolved_date is not equal to my _time.

1 Solution

DalJeanis
SplunkTrust

1) You could either reassign the resolved_date to the _time field (a quick sketch of that option follows this list), or calculate a field that tells you what you need to know, but I suspect, given the items below, that you need to rethink your whole approach.

2) eval(current_ticket_state) doesn't seem to make sense to me. It looks like you were trying to count the tickets that are in status "Assigned", but that expression won't do anything like that.

3) Counting a record that was resolved BEFORE the week in question as "Open" makes no sense.

4) I'm betting that your underlying ticket data is more complex than you are accounting for. For instance, each ticket probably has multiple events that record its state at various points in time, so you need to identify which ones matter for your use case and eliminate all the ones that don't.

5) Given all of the above, I'm betting that timechart is the wrong tool for what you are looking for.
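
To illustrate the first option in point 1: a minimal sketch that re-bases _time on the resolution date, so timechart buckets tickets by the week they were resolved. The strptime format is an assumption here; adjust it to match your data, or drop the strptime entirely if resolved_date is already an epoch value.

 ( your search that gets the desired tickets)
| rename COMMENT as "keep only events that carry a resolution date, then bucket on that date instead of the event time"
| where isnotnull(resolved_date)
| eval _time = strptime(resolved_date, "%Y-%m-%d %H:%M:%S")
| timechart span=1w count as Resolved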


We're assuming that this is your use case, in plain English:

"We want a chart of how many tickets are opened (assigned) in any given week, and how many are resolved.
We want to know how many open tickets exist at the beginning of each week, how many new tickets are opened, how many resolved, and how many remain open at the end of the week."

The easiest way to do this is to roll all the various events for each ticket together into a single unit, then break that unit into two pieces... the "open" record and the "close" record. Then you can just sum them up by week and keep a running net total using the streamstats command.

That looks like this... be sure to substitute the actual ticket_number field on your system.

 ( your search that gets the desired tickets)

| rename COMMENT as "create one record per ticket, then default the open date and the close date if not found"
| stats min(eval(case(current_ticket_state="Assigned",_time))) as open_date, 
    min(resolved_date) as resolved_date by ticket_number
| eval open_date=coalesce(open_date,resolved_date), resolved_date=coalesce(resolved_date,now()+7*86400)

| rename COMMENT as "turn each ticket into two records, one representing the open, one the close"
| eval mydates=mvappend(open_date,resolved_date)
| table mydates
| eval myfan=mvrange(0,2)
| mvexpand myfan
| eval mydates = mvindex(mydates,myfan)

| rename COMMENT as "bin the data into weeks, and sum up all the opens and closes for any given week"
| bin mydates as _time span=w@w0
| eval addOne=case(myfan=0,+1)
| eval subtractOne= case(myfan=1,+1)
| stats sum(addOne) as addSome sum(subtractOne) as subtractSome by _time
| fillnull value=0 addSome subtractSome
| eval netChange = addSome - subtractSome

| rename COMMENT as "Now, use streamstats to track the open tickets over time"
| streamstats sum(netChange) as OpenAtEndOfWeek,
    latest(addSome) as AddedThisWeek ,
    latest(subtractSome) as ClosedThisWeek
| table _time OpenAtEndOfWeek AddedThisWeek ClosedThisWeek
| where _time <= now()

The last line throws away the artificial future-week records created by the default resolved date for tickets that are not actually resolved.



cpetterborg
SplunkTrust

What exactly do you mean by lower_bound_date_of_week and upper_bound_date_of_week?


richgalloway
SplunkTrust

If all you need is the count and not a graph of counts, then consider using stats instead of timechart.

... | bucket span=w@w0 _time | stats count(eval(current_ticket_state)) as Assigned, count(eval(resolved_date < lower_bound_date_of_week OR resolved_date > upper_bound_date_of_week)) as Open
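
If the week boundaries you need are simply the edges of each bucket, you can derive them from the binned _time itself: after bin, _time is the start of the week and _time + 7*86400 is the end. A minimal sketch along those lines, not a drop-in answer; it assumes resolved_date is already an epoch timestamp (convert it with strptime first if it is a string) and that "Assigned" is the literal state value:

... | bin _time span=1w
| eval week_start=_time, week_end=_time + 7*86400
| stats sum(eval(if(current_ticket_state=="Assigned",1,0))) as Assigned,
    sum(eval(if(resolved_date >= week_start AND resolved_date < week_end,1,0))) as ResolvedThisWeek
    by _time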
---
If this reply helps you, Karma would be appreciated.

efavreau
Motivator

1) Consider creating your time boundaries with the time range picker or by using earliest=**** latest=**** time modifiers (https://docs.splunk.com/Documentation/Splunk/7.2.1/SearchReference/SearchTimeModifiers)
2) Assuming Resolved Date comes from the incident data, where are the values for the upper and lower bounds coming from?
3) Filter your time range before the timechart command so that only relevant events reach it. For performance, filter as much as you can before the first pipe (see the sketch below).
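
A minimal sketch of points 1 and 3 combined; the index, sourcetype, and state value are hypothetical placeholders, so substitute your own:

index=ticket_data sourcetype=ticket_events earliest=-12w@w latest=@w current_ticket_state="Assigned"
| timechart span=1w count as Assigned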

###

If this reply helps you, an upvote would be appreciated.

woodcock
Esteemed Legend

I do not understand what you are trying to achieve here. It will help greatly if you post some sample events and a mockup of your desired output.
