I'm building a dashboard from the data in two files: one with alarms fired by some equipment and one with all available equipment.
My first attempt was to use lookups to pull in the equipment data and correlate it with the alarms, but since the equipment can change over time, the lookup no longer matches correctly against historical data.
So for my second attempt I'm trying to merge the data (alarms + equipment) before indexing it. I believe this can be done by running a Python script before indexing, but my searches haven't turned up much about it.
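To illustrate the kind of pre-index merge I have in mind, here's a minimal sketch. The file names and column names (equipment_id, name, location, etc.) are just assumptions for the example, not my real schema:

```python
import csv

# Hypothetical file and column names -- adjust to the real data.
# equipment.csv: equipment_id, name, location
# alarms.csv:    timestamp, equipment_id, severity
def merge_alarms_with_equipment(equipment_path, alarms_path, output_path):
    # Build an in-memory map of equipment attributes keyed by equipment_id.
    with open(equipment_path, newline="") as f:
        equipment = {row["equipment_id"]: row for row in csv.DictReader(f)}

    with open(alarms_path, newline="") as f_in, \
         open(output_path, "w", newline="") as f_out:
        reader = csv.DictReader(f_in)
        fieldnames = reader.fieldnames + ["name", "location"]
        writer = csv.DictWriter(f_out, fieldnames=fieldnames)
        writer.writeheader()
        for alarm in reader:
            # Attach the equipment attributes as they are *right now*;
            # the enriched event is then frozen at index time, so later
            # changes to the equipment file can't break historical data.
            info = equipment.get(alarm["equipment_id"], {})
            alarm["name"] = info.get("name", "")
            alarm["location"] = info.get("location", "")
            writer.writerow(alarm)
```

The resulting enriched file would then be what gets indexed, instead of the raw alarms file.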
This is achievable with summary indexing, but I'd rather keep that as a last resort: I've had a scheduled search fail to run before for some reason, and I ended up with a hole in the final data. Any idea about this?
There are a couple of options for correlating your alarm data, but it really depends on where your equipment data is coming from. One way would be to create a lookup from the external file and then compare it against your alarm data in Splunk. What you're looking for is this: http://docs.splunk.com/Documentation/Splunk/6.2.2/Knowledge/Addfieldsfromexternaldatasources
How are these two types of data correlated? Do you have an equipment_id-style primary key? If a primary key is available, my suggestion would be to use a time-based lookup, which stores the equipment attributes per equipment_id along with the time they became effective, so you can do a lookup that correlates historical data correctly as well.
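The matching rule behind a time-based lookup can be sketched in plain Python: each lookup row carries an effective time, and an event at time t matches the most recent row at or before t. The history table below is made-up example data:

```python
from bisect import bisect_right

# Hypothetical equipment history, mimicking a time-based lookup:
# each row is (effective_time, attributes), sorted by time.
history = {
    "E1": [
        (100, {"location": "Hall A"}),
        (500, {"location": "Hall B"}),  # E1 moved at t=500
    ],
}

def lookup_at(equipment_id, event_time):
    # Return the attributes in effect at event_time, i.e. the most
    # recent row whose effective_time is <= event_time.
    rows = history.get(equipment_id, [])
    times = [t for t, _ in rows]
    i = bisect_right(times, event_time) - 1
    return rows[i][1] if i >= 0 else None
```

So an alarm at t=300 would resolve E1 to "Hall A", while one at t=600 would resolve it to "Hall B", which is exactly what keeps historical correlations correct when equipment changes.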
See this