Splunk Search

Why does appendcols cause a search to produce different results?

andrewtrobec
Motivator

Hello all,

I have an index of events, each of which has an enter and an exit timestamp; _time is associated with the enter time.

My first search is very simple and shows the daily enter frequency:

index=my_index
| timechart span=1d count as EnterCount

This produces the following table:

_time,EnterCount
2016-08-08,0
2016-08-09,16
2016-08-10,20
2016-08-11,23
2016-08-12,13
2016-08-13,2
2016-08-14,0
2016-08-15,0
2016-08-16,11

I then have another search that shows the daily exit frequency:

index=my_index
| eval _time = strptime(Exit, "%Y-%m-%d %H:%M:%S")
| timechart span=1d count as ExitCount

This produces the following table:

_time,ExitCount
2016-08-08,1
2016-08-09,4
2016-08-10,9
2016-08-11,16
2016-08-12,4
2016-08-13,0
2016-08-14,0
2016-08-15,0
2016-08-16,4

To chart the relationship between enter and exit, I use appendcols to put them together:

index=my_index
| timechart span=1d count as EnterCount
| appendcols [ search index=my_index
 | eval _time = strptime(Exit, "%Y-%m-%d %H:%M:%S")
 | timechart span=1d count as ExitCount ]

The resulting table, however, displays incorrect values for ExitCount, with the exit results brought forward by 3 days:

_time,EnterCount,ExitCount
2016-08-08,0,16
2016-08-09,16,4
2016-08-10,20,0
2016-08-11,23,0
2016-08-12,13,0
2016-08-13,2,4
2016-08-14,0,10
2016-08-15,0,13
2016-08-16,11,9

Why does this happen, and what can I do to ensure that the correct association is being made?

Thank you and best regards,

Andrew

1 Solution

andrewtrobec
Motivator

In one of my other questions, @rjthibod suggested working with the chart and append commands (in the meantime I had found a solution using join, which I was told isn't the most efficient way of doing things). The following search works perfectly:

index=my_index
| chart count as EnterCount over _time span=1d
| append 
 [ 
  search index=my_index
  | eval _time = strptime(Exit, "%Y-%m-%d %H:%M:%S")
  | chart count as ExitCount over _time span=1d
 ]
| timechart span=1d latest(*) as *
| fillnull value=0

Thanks to @DalJeanis, @woodcock, and @niketnilay for their valuable input; I very much appreciate it.

woodcock
Esteemed Legend

Our solutions also avoid using append for the same reason that you should avoid join: subsearch limits. Your append-based solution will break down once the appended results exceed roughly 50K rows.
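
For reference, here is a single-pass sketch in the same spirit as the answers below, with no subsearch and therefore no append/join limits (it assumes the Exit field holds the exit timestamp string, as in the original search):

index=my_index
| eval exit_epoch = strptime(Exit, "%Y-%m-%d %H:%M:%S")
| eval Times = mvappend("enter=" . _time, "exit=" . exit_epoch)
| mvexpand Times
| rex field=Times "(?<type>\w+)=(?<epoch>.+)"
| eval _time = tonumber(epoch)
| timechart span=1d count by type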


DalJeanis
SplunkTrust

This part produces random test data; only _time and exit_time matter.

| gentimes start="01/25/2017:23:00:00" end="01/27/2017:01:00:00" increment=23m 
| append[| gentimes start="01/26/2017:03:00:00" end="01/26/2017:21:00:00" increment=47m ]
| append[| gentimes start="01/26/2017:01:17:00" end="01/26/2017:23:18:00" increment=21m ]
| eval _time = starttime | table _time 
| eval rand1=random() | eval rand2=substr(rand1,1,5) 
| eval exit_time = _time + tonumber(rand2)
| table _time exit_time

This part separates _time and exit_time into one record per type, then reports on the counts:

| eval Times ="enter="._time." exit=".exit_time
| table Times |  makemv Times | mvexpand Times 
| makemv delim="=" Times | eval type=mvindex(Times,0),_time=mvindex(Times,1) 
| table _time, type
| bin _time span=1d 
| chart count over _time by type

Resulting in this -

_time         enter     exit      
1/25/2017     3         0         
1/26/2017     149       85        
1/27/2017     2         69        

woodcock
Esteemed Legend

Do it like this:

index=my_index 
| bin _time span=1d 
| eval exit_time = strptime(Exit, "%Y-%m-%d %H:%M:%S") 
| bin exit_time span=1d | multireport
[| stats count AS EnterCount by _time]
[| stats count AS ExitCount by exit_time | rename exit_time AS _time]
| stats values(*) AS * BY _time

niketn
Legend

If you have the same date range, why would you want to use appendcols? Please provide context.

In any case, if the time in your Exit search is shifted by 3 days, you can either adjust it with _time=(_time-3*24*60*60), or try including earliest and latest dates directly in the base search of both correlated searches.
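
For example, a sketch of the second option, pinning the same explicit time range on both the outer and inner search (the dates are just the range from the question above):

index=my_index earliest="08/08/2016:00:00:00" latest="08/17/2016:00:00:00"
| timechart span=1d count as EnterCount
| appendcols
 [ search index=my_index earliest="08/08/2016:00:00:00" latest="08/17/2016:00:00:00"
 | eval _time = strptime(Exit, "%Y-%m-%d %H:%M:%S")
 | timechart span=1d count as ExitCount ]

The idea is that both timecharts then produce the same set of daily buckets, so appendcols, which matches rows purely by position, lines them up as expected.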


andrewtrobec
Motivator

Thanks for the response!

I need appendcols because I am putting two different metrics into the same table. Since they both cover the same range I should be able to do this, but for some reason the rows are being matched incorrectly.

Regards,

Andrew


niketn
Legend

@andrewtrobec... Did you try any of the options provided here? Since you mentioned the time shifting by 3 days, I suggested correcting the time in the inner query by 3 days... There are two other answers that don't use appendcols; can you try them and confirm?


andrewtrobec
Motivator

Thanks for the follow-up. I have found a working solution and updated the post with the information. Thank you for your support!
