Hi All,
Here is a query which returns me DATA size and Index size based on domain.
index=db_connect sourcetype=db_metric |join TABLE_NAME [|inputlookup Domain_Module_List.csv |search (Domain ="Inventory")] |eval DATA_MB =round(DATA_KB/1024,2) |eval INDEX_MB = round(INDEX_SIZE_KB/1024,2) |timechart span=1mon sum(DATA_MB),sum(INDEX_MB) by Domain limit=25
I tried to do a sum of DATA and INDEX by rewriting the above query as
index=db_connect sourcetype=db_metric |join TABLE_NAME [|inputlookup Domain_Module_List.csv |search (Domain ="Inventory")] |eval DATA_MB =(DATA_KB/1024) |eval INDEX_MB = (INDEX_SIZE_KB/1024) |eval SIZE = round((DATA_MB + INDEX_MB),2) |timechart span=1mon sum(SIZE) by Domain
But this doesn't produce the correct value when validated. I think I am wrong somewhere in the syntax of the summing query.
How do I sum both DATA and INDEX together and display the result?
Thanks
Sangeetha
Your original query gives you the expected results? Would this work for you, then?
index=db_connect sourcetype=db_metric |join TABLE_NAME [|inputlookup Domain_Module_List.csv |search (Domain ="Inventory")] |eval DATA_MB =round(DATA_KB/1024,2) |eval INDEX_MB = round(INDEX_SIZE_KB/1024,2) |timechart span=1mon limit=25 sum(DATA_MB) as datamb,sum(INDEX_MB) as indexmb by Domain|foreach indexmb* datamb* [eval size<<MATCHSTR>>='datamb<<MATCHSTR>>'+'indexmb<<MATCHSTR>>']|fields - datamb* indexmb*
Yes it does work. Thanks a lot.
First, an unrelated question: why are you doing a join on inputlookup? That feels rather odd, as opposed to just using the lookup command (maybe with an eval beforehand to insert a needed input field?).
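For reference, a minimal sketch of the lookup-based version, assuming Domain_Module_List.csv is uploaded as a lookup table file and is keyed by TABLE_NAME (adjust the field names if your CSV differs):
index=db_connect sourcetype=db_metric | lookup Domain_Module_List.csv TABLE_NAME OUTPUT Domain | search Domain="Inventory" | eval DATA_MB=round(DATA_KB/1024,2) | eval INDEX_MB=round(INDEX_SIZE_KB/1024,2) | timechart span=1mon sum(DATA_MB), sum(INDEX_MB) by Domain limit=25
This avoids the subsearch limits that come with join, since lookup enriches each event directly.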
Second, I suspect sum may not be the timechart function you want. I would think a metric of data sizes would be reported as a gauge metric, so each sampling period reports the current total consumption, as opposed to the consumption added during that period. In that case, summing over time inflates the value by a factor of the number of samples per table. You may need to look at smaller windows with min/max/stddev metrics and then aggregate across those for larger windows. Can you share more info on what your source data means?
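For example, if each event is a point-in-time gauge reading per table, a sketch like this (untested against your data) takes the latest reading per table per month and then sums across tables, instead of summing every sample:
index=db_connect sourcetype=db_metric | lookup Domain_Module_List.csv TABLE_NAME OUTPUT Domain | search Domain="Inventory" | eval SIZE=round((DATA_KB + INDEX_SIZE_KB)/1024,2) | bin _time span=1mon | stats latest(SIZE) as SIZE by _time, TABLE_NAME, Domain | timechart span=1mon sum(SIZE) by Domain
The stats step collapses repeated samples of the same table within the month, so the final sum reflects one reading per table rather than every sample added together.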
My lookup file contains the tables that belong to each domain - this is not captured anywhere in the logs. So I am fetching the domain from the lookup and the data size of the table from the logs. The Splunk-indexed log contains table-level data size and index size.