Dashboards & Visualizations

How to create a dashboard showing in-progress, completed, and pending job status

Mohsin123
Path Finder

Hi,
I am preparing a dashboard for WebSphere team job monitoring. There are 29 jobs. For each job, the server logs a "started" type message and a "completed successfully" type message. I have to show three tables: in-progress jobs, completed jobs, and pending jobs. My logic is: the in-progress table shows the latest job that started (head 1, sorted by _time). The completed table shows all jobs that completed today (no data from any other date). The pending table shows jobs that have not been seen at all today, i.e. neither started nor completed.

For example, with five jobs a, b, c, d, e: if in-progress shows only b and completed shows a and c, then pending should show d and e. I hope that is clear. If all jobs are complete, i.e. completed shows a, b, c, d, e, then in-progress and pending will be empty. Remember, I have to pick up logs for today only.

My plan is to keep a master list of all the jobs. The in-progress table will display head 1, and the completed table will display all of today's completions (today's data only). Using mvappend, I can combine the in-progress and completed job names and subtract them from the master list to get the pending jobs. Please help.
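Roughly, for the pending part I was imagining something like this (the index name, the CSV file name, and the log message text below are only placeholders for my environment, and it assumes a jobname field is already extracted):

| inputcsv master_jobs.csv
| fields jobname
| search NOT [ search index=was_jobs earliest=@d latest=now ("Job started" OR "Job completed successfully")
    | dedup jobname
    | table jobname ]

That would leave only the jobs with no start or completion event so far today.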

1 Solution

DalJeanis
Legend

Here's one way.

your search that gets all the jobs that have started today AND all the jobs that have completed today
| eval status= (an if or case statement that sets this event to "started" or "complete" based on whether this event record is a start or a completed record)
| fields _time jobname status
| append [| inputcsv myfileofjobs.csv | table jobname | eval status="pending" | eval _time=floor(now()/86400)*86400]
| sort 0 - _time jobname
| dedup jobname

The above will give you the most recent record for each job. You also don't have to worry about a job that has been restarted and has not yet completed again; it will simply show as started.
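For example, if the start and completion messages contain the literal text "started" and "completed successfully", the skeleton might be filled in like this (the index, sourcetype, and message text are placeholders, and it assumes a jobname field is already extracted):

index=was_jobs sourcetype=was:joblog earliest=@d latest=now ("started" OR "completed successfully")
| eval status=case(searchmatch("completed successfully"), "complete", searchmatch("started"), "started")
| fields _time jobname status
| append [| inputcsv myfileofjobs.csv | table jobname | eval status="pending" | eval _time=floor(now()/86400)*86400]
| sort 0 - _time jobname
| dedup jobname

The exact search terms and field extractions will depend on how your WebSphere job log entries are written.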

Use the above as the base search; each panel can then have a post-process search for its particular status flag, sorted by jobname or time depending on how you want to display the results. You could let the user pick the sort order with a dropdown or radio buttons.
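The per-panel post-process searches could be as simple as something like this (sketch):

In Progress panel:
| search status="started" | table _time jobname

Completed panel:
| search status="complete" | table _time jobname

Pending panel:
| search status="pending" | table jobname

In Simple XML, give the base search an id and reference it from each panel with <search base="...">, so the base search runs only once.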


Mohsin123
Path Finder

OK, I'll try that tomorrow morning. Thank you so much for your reply. I would also like to get the elapsed time (end time minus start time). How do I subtract the times?

DalJeanis
Legend
your search that gets all the jobs that have started today AND all the jobs that have completed today
| eval status= (an if or case statement that sets this event to "started" or "complete" based on whether this event record is a start or a completed record)
| fields _time jobname status
| append [| inputcsv myfileofjobs.csv | table jobname | eval status="pending" | eval _time=floor(now()/86400)*86400]
| eval starttime=case(status="started",_time)
| eval endtime=case(status="complete",_time)
| stats max(_time) as maxtime, max(starttime) as starttime, max(endtime) as endtime, latest(status) as status by jobname
| rename maxtime as _time
| eval duration=case(endtime>starttime,endtime-starttime)
| eval endtime=case(endtime>starttime,endtime)

That gets you the duration in seconds. Divide by 60 for minutes or 3600 for hours.
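If you would rather display the duration as HH:MM:SS, one option is to add a line like this at the end (duration_hms is just an arbitrary new field name):

| eval duration_hms=tostring(duration, "duration")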

Take a look at the output lines - there should be 29 of them. You can adjust them however you want before continuing. We've set it up so that endtime disappears if the job has been restarted later.
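If you also want starttime and endtime shown as readable timestamps rather than epoch seconds, something like this at the end of the search would do it (the format string is just an example):

| fieldformat starttime=strftime(starttime, "%Y-%m-%d %H:%M:%S")
| fieldformat endtime=strftime(endtime, "%Y-%m-%d %H:%M:%S")

fieldformat changes only how the values are displayed, not the underlying epoch values.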
