The following example is pretty fast:
| from datamodel:rc-stats | search _time > 1519966560 _time <= 1519970160 | stats count
Why are the next two similar queries so slow? The slowness also occurs if we evaluate the values before the search.
| from datamodel:rc-stats | search earliest = 1519966560 latest = 1519970160 | stats count as cnt
| from datamodel:rc-stats | search _time > -4h@h _time <= -3h@h | stats count as cnt
We need to calculate the time ranges before searching the data model, but if we use anything other than a numeric value, the search does a full scan, which is very slow.
Is there a way to run a search with calculated time ranges that is as fast as the first query?
Example: (the real query uses more complex ranges)
| from datamodel:rc-stats
| eval now = now()
| eval lastFourHours = now - (4 * 3600)
| search _time >= lastFourHours
| ...
Try this tstats approach:
| tstats summariesonly=true count from datamodel=datamodelName WHERE _time > whateverInEpoch AND _time < whateverInEpoch
Read this documentation on what the tstats options do. For example, if the data model is not accelerated, summariesonly=true won't work.
https://docs.splunk.com/Documentation/Splunk/7.1.1/SearchReference/Tstats
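If the ranges you need are relative rather than arbitrary computed epochs, tstats also accepts earliest/latest time modifiers directly in its WHERE clause, which keeps the search fast. A sketch against your data model (assuming rc-stats is accelerated):

```
| tstats summariesonly=true count from datamodel=rc-stats WHERE earliest=-4h@h latest=-3h@h
```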
Calculating the ranges before running tstats is not possible; it fails with: "Error in 'tstats' command: This command must be the first command of a search."
| eval now = now()
| eval lastFourHours = now - (4 * 3600)
| eval yesterdayStart = lastFourHours - 86400
| eval yesterdayEnd = now - 86400
| tstats summariesonly=true count from datamodel=rc-stats WHERE (_time >= yesterdayStart) AND (_time <= yesterdayEnd)
That’s why I have a pipe before tstats:
| tstats ...
I see what you’re saying.
You could use the map command:
| makeresults
| eval now = now()
| eval lastFourHours = now - (4 * 3600)
| eval yesterdayStart = lastFourHours - 86400
| eval yesterdayEnd = now - 86400
| map maxsearches=1 search="| tstats summariesonly=true count from datamodel=rc-stats WHERE (_time >= $yesterdayStart$) AND (_time <= $yesterdayEnd$)"
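An alternative to map (a sketch, not tested against your data model) is to compute the bounds in a subsearch and let the return command splice them into the tstats WHERE clause as earliest/latest. This keeps tstats as the first command of the outer search; the arithmetic mirrors the evals above:

```
| tstats summariesonly=true count from datamodel=rc-stats WHERE
    [| makeresults
     | eval earliest = now() - (4 * 3600) - 86400
     | eval latest = now() - 86400
     | return earliest latest]
```

The subsearch runs first and its output is substituted as earliest=... latest=... into the WHERE clause, so tstats still reads only the summaries for that window.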