Hi everyone,
I would like to create a timechart of unique users who made at least one action="purchase" per day. I need the total count of unique purchasers and the count of unique users who made their first lifetime purchase on that day.
01.01.2013 , 10000 , 2000
This means:
On 01.01.2013, 10000 unique users made a purchase. Out of these 10000, 2000 unique users made their first lifetime purchase on that day. I could build something like this with two different searches and write the outputs into a single Excel sheet afterwards, but it would be great to have it in one single Splunk search.
Thanks for your help
Heinz
Hi,
these are the single searches I would use here and combine externally:
| timechart dc(eval(if(action="purchase", user, null())))
| stats min(eval(if(action="purchase", _time, null()))) AS first_time by user | eval day=strftime(first_time, "%d.%m.%Y")
A count per day could be calculated from the second search and added to the first table.
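As a sketch of that last step: since every user appears exactly once after the stats by user, counting rows per day gives the number of first-time purchasers per day. (The leading search filter is an assumption; adjust it to your data.)

```
action="purchase"
| stats min(_time) AS first_time by user
| eval day=strftime(first_time, "%d.%m.%Y")
| stats count AS new_purchasers by day
```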
Best
Heinz
Thanks a lot 🙂
Yep, _time is the default field for bin, and bin is an alias for bucket 🙂
Ah ok, so it seems to have the same effect as | bucket span=1d _time |
I've got "first_time" as a timestamp in seconds, so I have to use the bin command here as well to be able to compare both timestamps. My approach is now this, and it seems to work:
| bin span=1d first_time
| bin span=1d _time
| timechart span=1d dc(user) AS Total_Users, dc(eval(if(_time=first_time, user, null()))) AS "New Users"
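Putting it together as one search: the first_time field can be computed inline with eventstats instead of a separate search. This is a sketch under one assumption worth stressing: eventstats only sees events within the search time range, so the search must cover all time for first_time to really be the first *lifetime* purchase.

```
action="purchase"
| eventstats min(_time) AS first_time by user
| bin span=1d first_time
| bin span=1d _time
| timechart span=1d dc(user) AS "Total Users", dc(eval(if(_time=first_time, user, null()))) AS "New Users"
```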
... | bin span=1d | ...
After that, _time will be in unixtime (seconds since 1970) rounded down to midnight. If your first_time is a date as a string (e.g. "2013-11-08") you can use strftime(_time, "%Y-%m-%d") to convert the timestamp to the same format.
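For that string case, a sketch of the comparison (assuming first_time holds a "%Y-%m-%d" string; the field names are from the thread, the rest is illustrative):

```
| bin span=1d _time
| eval day=strftime(_time, "%Y-%m-%d")
| eval is_first=if(day=first_time, 1, 0)
| timechart span=1d dc(user) AS "Total Users", dc(eval(if(is_first=1, user, null()))) AS "New Users"
```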
Thanks for the input. I have some questions regarding this:
What does "bin by day" mean?
And in what kind of format do I need "first_time" here to compare it to _time? Unixtime?
Without thinking about precise syntax, here's an approach that should work:
Additionally, please post the two single queries you would use.
Could you provide a sample event for this scenario?