Hi All
I am trying to schedule a job that runs every day and pulls the last 30 days of data into a CSV file for use as a lookup.
I wrote this query in the job.
But the query always fetches data from the entire index, including events older than 30 days, which causes the CSV to become extremely large.
Can someone help me identify what is wrong with the query?
ResponseTime and page are extracted fields.
| addinfo
| eval info_max_time="now"
| eval info_min_time="-30d@d"
| where _time >= info_min_time AND _time < info_max_time
| stats values(ResponseTime) as "resp_time" by page,_time
| table resp_time, page, _time
| outputlookup average_resp_time.csv
Thanks
I am just trying to understand: if you need to pull data from a particular index, you need to include that index in the search, right?
If you are only interested in the last 30 days' events, you can start with
index=main earliest=-30d@d
and then restrict the output to what you need for your reports. Does this help?
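Building on that suggestion, here is a minimal sketch of the rewritten scheduled search (the index name main is an assumption; substitute your own). Note that in the original query, eval overwrites the epoch timestamps that addinfo provides with the literal strings "now" and "-30d@d", so the where clause likely never filters by time as intended. Putting earliest/latest in the base search avoids that problem and filters at search time:

```spl
index=main earliest=-30d@d latest=now
| stats values(ResponseTime) as resp_time by page, _time
| table resp_time, page, _time
| outputlookup average_resp_time.csv
```

If you do need to compute cutoffs inside the pipeline, eval with relative_time(now(), "-30d@d") returns an epoch value that can be compared against _time, but restricting the time range in the base search is simpler and faster.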
Hmm, I hadn't thought along those lines.
This helped.
Thank you very much.