Help with Transforming an ingest of timestamped data into a weekly backlog graph

Tomten72

Hi forum!

I have a couple of tricky questions about working with the same input data and the same type of graphs...

I am currently working on weekly Jira ticket inflow/outflow/backlog column graphs, based on ticket data pulled from the Confluence REST API as simple CSV and ingested nightly into a Splunk index.

Essential JIRA fields -> CSV fields used:

key (ticket ID), created (datestamp), resolved (datestamp), priority

Each day we ingest ALL Jira tickets for a project into the Splunk index (let's say index "project", keyed with source "jira_bugs"). In theory, I should be able to drive the graphs from the last 24 hours of ingested data, based on the dates in the data.

 

Based on that I create a weekly Jira ticket inflow graph, over the last 6 months of data (last 24 hours data):

index="project" source="jira_bugs" | dedup key | 
eval created_date = strptime(created,"%Y-%m-%d") | 
eval resolved_date = strptime(resolved,"%Y-%m-%d") | 
eval _time = created_date | 
where _time > now()-15811200 | 
timechart span=1w partial=false count AS "TR inflow" BY priority | 
addtotals | 
convert timeformat="W%y%V" ctime(_time) AS date | 
sort _time | fields - _time | table date *

So, I use the ticket's created date as _time, bucket the data into weekly spans with timechart, and then convert the time label to a week label (Wyyww). The output drives stacked columns (one per priority) with an overlay line graph for the totals.

Outflow is ditto, but _time is driven by resolved_date instead.

The problem with this approach is that if no tickets are created in a given week, the graph should (but does not) render an empty space for that week. I am thinking I may need to chain this query with a preceding gentimes (7-day increments starting 6 months ago?), and then somehow group the ticket counts into the generated time events?
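One way to force the empty weeks to appear, without a separate gentimes pass, might be makecontinuous, which inserts the missing weekly _time buckets after timechart and before the week labels are applied (a sketch based on the inflow query above; fillnull turns the inserted empty buckets into zero counts):

```spl
index="project" source="jira_bugs" | dedup key |
eval _time = strptime(created,"%Y-%m-%d") |
where _time > now()-15811200 |
timechart span=1w partial=false count AS "TR inflow" BY priority |
makecontinuous _time span=1w |
fillnull value=0 |
addtotals |
convert timeformat="W%y%V" ctime(_time) AS date |
sort _time | fields - _time | table date *
```

Note that makecontinuous only fills gaps between the first and last observed week; a quiet week at the very start or end of the 6-month window would still need synthetic boundary events (e.g. from gentimes or makeresults).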

Secondly, I need to create a weekly Jira ticket backlog graph, and this feels even trickier. For this, I need to count, for each week, the number of tickets that were open during that week (evaluating whether created_date < [the week, or some day within it] < resolved_date).

So the same ticket must be counted (duplicated with different _time stamps?) across all the weeks (columns in the graph) during which it was open.
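One common pattern for exactly this kind of duplication (a sketch; created_epoch, resolved_epoch and open_week are hypothetical field names) is to build a multivalue field with one epoch per 7 days the ticket was open using mvrange, mvexpand it so the same ticket lands in every weekly bucket, and then timechart a distinct count of keys:

```spl
index="project" source="jira_bugs" | dedup key |
eval created_epoch = strptime(created,"%Y-%m-%d") |
eval resolved_epoch = if(isnull(resolved) OR resolved="", now(), strptime(resolved,"%Y-%m-%d")) |
eval open_week = mvrange(created_epoch, resolved_epoch, 604800) |
mvexpand open_week |
eval _time = open_week |
where _time > now()-15811200 |
timechart span=1w partial=false dc(key) BY priority |
addtotals |
convert timeformat="W%y%V" ctime(_time) AS date |
sort _time | fields - _time | table date *
```

Since mvrange steps in 7-day increments from each ticket's created date rather than calendar week boundaries, the buckets are approximate; snapping first with eval created_epoch = relative_time(created_epoch, "@w") should align them to week starts.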

It seems like a simple thing, but each time I attack this problem I give up after googling and testing a number of ideas from the Splunk reference docs and forums.
