I'm seeing an unexpected result when summing a field in the following search:
... | stats sum(SumofCoreSecs) as total | eval Total = tostring(total, "commas") | table Total
This works fine up to a certain number length, but I recently ran across a sum that came out negative:
-9,302,873,272,376
I know that there are no negative numbers in the raw data. Is this a condition of the sum function?
I actually figured out that my problem was the length of the _raw field itself. Occasionally the data I'm pulling in has valid single-line events that exceed 10,000 characters, which is Splunk's default truncation limit, so I had to turn truncation off in the sourcetype settings, and that stopped it chopping off my data in random places. Unfortunately I lost some data that can't be regenerated while learning this, but I'll catch it in future iterations.
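For reference, this is roughly the change I made in props.conf (the sourcetype name here is made up; adjust it for your data):

[my_sourcetype]
# Default is TRUNCATE = 10000; setting it to 0 disables line truncation entirely
TRUNCATE = 0

Note that disabling truncation only affects data indexed after the change, which is why the already-truncated events couldn't be recovered.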
Thanks for the help!
Assuming that your SumofCoreSecs doesn't contain commas, I tested the sum function with much bigger values and it worked just fine.
If your SumofCoreSecs does contain commas, use the convert command to remove them. See this link: http://answers.splunk.com/answers/36792/how-to-sum-numbers-with-commas
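For example, something like this (an untested sketch based on your search, using convert's rmcomma function to strip commas before summing):

... | convert rmcomma(SumofCoreSecs) | stats sum(SumofCoreSecs) as total | eval Total = tostring(total, "commas") | table Total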
Does the field SumofCoreSecs have commas?