Knowledge Management

How do I use a summary index populated by a search scheduled every 15 minutes, and return only events from the last 24 hours?

prashanthberam
Explorer

I have created a summary index for a scheduled search that runs every 15 minutes, but I did not specify any time range when creating the report, and I am getting every result since the summary index was created when I retrieve the results. How can I return only the events from the last 24 hours? Can anyone help me?

Thanks in advance.

1 Solution

niketn
Legend

A few things about your query:

1) First, the fix for your issue: similar to your startTime field, use the following to create the _time field, since it will be used as the timestamp in the summary index.

| eval _time=startTime

PS: Since your query runs every 15 minutes, instead of the above you can also set this time to the start of the 15-minute window that the query is running in.
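A minimal sketch of that alternative, assuming the schedule is aligned to wall-clock quarter-hour boundaries (the 900-second modulo snap is my assumption, not something from the original query):

| eval _time = now() - (now() % 900)

now() returns the scheduled search's start time, so the modulo snaps it down to the start of the current 15-minute window.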

While the above should fix your issue, you can also try improving performance and maintainability by:
2) Creating knowledge objects through search-time field extractions for the multiple rex commands you have defined (see the props.conf sketch after this list).
3) Trying to change the event correlation from transaction to stats (a sketch follows the PS below).
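For point 2, a minimal props.conf sketch; the sourcetype name is a placeholder, and the two patterns are lifted from the query later in this thread. EXTRACT- settings run at search time, so no re-indexing is needed:

[your_sourcetype]
EXTRACT-id = ID:(?<id>.*)
EXTRACT-provassignzip = provZipCode assigned to zipCode:(?<provassignZip>.*)
EXTRACT-memzipassignzip = memZipCode assigned to zipCode:(?<memzipassignzip>.*)

Once these are in place, the matching rex commands can be removed from the search.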

PS: In your existing query you have used stats to get StartTime, EndTime, and ResponseTime. However, transaction computes duration on its own and sets _time to the time of the first event in the transaction. Please re-evaluate whether your query is working as expected.
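For point 3, a rough sketch of the transaction-to-stats change, keeping the transfield counter from your query and assuming every field you need is extracted before this point (values(*) as * is wildcard shorthand, not your exact field list):

... | eval transfield=if(searchmatch("Inbound Message") OR searchmatch("Outbound Message"),1,0)
| accum transfield
| stats min(_time) as _time values(*) as * by transfield

stats generally scales better than transaction because it does not hold whole groups of raw events in memory.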

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"


prashanthberam
Explorer

I have created one summary index, and I want to use it in a dashboard via index=summaryindex search_name="hhhhh" to generate reports with the time range picker. But after creating the summary index, I checked the events in it: events from the last 3 months all have the same timestamp, which is the date I created the summary index.
One more thing: the search is scheduled every 15 minutes, and if I have 15 events, after the scheduled search runs I get 30 events, so I am getting duplicates of my records. How can I solve these 2 problems? Thanks.


arkadyz1
Builder

Are you retrieving the results with a search command? If yes, make sure you use the time picker to the left of the search button.

However, I have a feeling that you do not specify a time range in your summary search. If you are running the report and saving it into a summary index ("a" summary index because you can have your own summary indexes, not necessarily one called "summary"), make sure you fill in the "earliest" and "latest" values in the Time Range section. For example, if you are running a report every 15 minutes, you might want to specify "-16m@m" and "-1m@m" - just keep the interval at 15 minutes. This way you will collect only the data that was not previously collected; see the savedsearches.conf sketch below.
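A minimal savedsearches.conf sketch of that setup, assuming the report is the search_name="hhhhh" mentioned above and that the summary index is named summaryindex (both placeholders from this thread, adapt to your environment):

[hhhhh]
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -16m@m
dispatch.latest_time = -1m@m
action.summary_index = 1
action.summary_index._name = summaryindex

Because every run then covers its own distinct 15-minute window, this also avoids collecting the same events twice.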


prashanthberam
Explorer

I have tried manually with the collect command, but I am getting the same problem. Could you please suggest how to preserve the original timestamps?


prashanthberam
Explorer
index=ccsp_test_was source="/usr/WASLogs700/cdhpws_uat3_*/cdhpws/logs/application.log" "getProcedureDetailBlueChip" OR "getProcedureDetailBlueChipResponse"
AND "Inbound Message" OR "Outbound Message" OR "getProcedureDetailBlueChip response time returning procedure details" OR "memZipCode assigned to zipCode"
OR "provZipCode assigned to zipCode" OR "bnftAgrmtNbr"
| rex "(?Inbound|Outbound)" | eval transfield=if(searchmatch("Inbound Message") OR searchmatch("Outbound Message"),1,0) | accum transfield | transaction transfield | rex "ID:(?<id>.*)"
| rex "(?m)\(?.*)" | rex "(?m)\(?.*)" | rex "(?m)\(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "(?.*)" | rex "provZipCode assigned to zipCode:(?<provassignZip>.*)" | rex "memZipCode assigned to zipCode:(?<memzipassignzip>.*)"
| stats min(_time) as startTime, max(_time) as endTime, values(info) as Info, values(ResponseTime) as responseTime, values(StatusCode) as StatusCode, values(message) as StatusMessage, values(CorpEntCd) as corpEntCd, values(costlvlpctl) as Costlvlpctl, values(CptCode) as cptCode, values(GroupNbr) as GroupNbr, values(MemZipCode) as memZipCode, values(procdchrgamt) as ProcChrgamt, values(ProvZipCode) as ProvZipCode, values(SectionNbr) as SectionNbr, values(ServiceDate) as ServiceDate, values(tretcatcd) as TretCatCd, values(tretcatname) as TretCatName, values(bnftAgrmtNbr) as bnftAgrmtNbr, values(acctNbr) as acctNbr, values(provassignZip) as provassignZip, values(memzipassignzip) as memzipassignzip by id,source
| eval responseTime=endTime-startTime | eval StartTime=strftime(startTime,"%Y-%m-%d %H:%M:%S,%3N") | eval EndTime=strftime(endTime,"%Y-%m-%d %H:%M:%S,%3N")
| table id,Info,StartTime,EndTime,responseTime,StatusCode,StatusMessage,source,corpEntCd,Costlvlpctl,cptCode,GroupNbr,memZipCode,ProcChrgamt,ProvZipCode,SectionNbr,ServiceDate,TretCatCd,TretCatName,bnftAgrmtNbr,acctNbr,provassignZip,memzipassignzip

This is the search I have used to generate the summary report. I haven't included the collect command. Could you please tell me where I have to include it?


arkadyz1
Builder

OK, so you are not removing any fields in your search, which means your _time should be kept intact (losing milliseconds - that's a feature of summary searches) - and I see that you are computing all kinds of stats from _time, so I'll assume that your events have legitimate timestamps.
Can you make a timechart of your original data - something like this:

index=ccsp_test_was source="/usr/WASLogs700/cdhpws_uat3_*/cdhpws/logs/application.log" "getProcedureDetailBlueChip" OR "getProcedureDetailBlueChipResponse"
AND "Inbound Message" OR "Outbound Message" OR "getProcedureDetailBlueChip response time returning procedure details" OR "memZipCode assigned to zipCode" 
OR "provZipCode assigned to zipCode" OR "bnftAgrmtNbr" | timechart bins=100 count

and tell us what you see. And do the same with your summary data:

index=summaryindex search_name="<your search name>" | timechart bins=100 count

This will visualize the time distribution for you. If you see all events near one time, there is something wrong with how you extract timestamps from your original data.


prashanthberam
Explorer

After creating the summary index, all the events that were generated have the same timestamp (the date the summary index was created). So when retrieving the events in the dashboard, even if I select a 15-minute range in the time range picker, I get the whole data set.


arkadyz1
Builder

Did your initial data have timestamps? It sounds like either you stripped that _time field from them while summarizing or they never had it in the first place.

Please send the search string of your summary-generating report, or, if you did it manually with collect, send that search string including the collect command.


inventsekar
Ultra Champion

It's a bit confusing. Please provide some more details.


tkomatsubara_sp
Splunk Employee

Yes. Please provide a screenshot and some SPL.
