I have a field that is more than 10,000 characters long. I updated props.conf to include:
[source::log.txt]
TRUNCATE=20000
Splunk now indexes the entire event, but the content of the long field is ignored when searching. For example,
search | eval l = len(long_field)
returns a length of 1. Where can I change the maximum length of a field?
Thanks
You might be hitting this limit (from limits.conf):
maxchars = <integer>
* Truncate _raw to this size and then do auto KV.
* Defaults to 10240 characters.
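If that is the cause, the override would go in a local limits.conf under the [kv] stanza (where maxchars lives). A minimal sketch, using 20000 only to match the TRUNCATE value above:

[kv]
maxchars = 20000

Place it in $SPLUNK_HOME/etc/system/local/limits.conf (or an app's local directory) and restart Splunk for it to take effect.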
Shorter fields work as expected. For example, if I count the field length across all events, the maximum length observed is 9996; all fields with a known length greater than 10,000 show a length of 1. So it is clearly being limited to roughly 10,000 characters somewhere.
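A search along these lines will show the cap, assuming the field in question is named long_field (long_field is a placeholder for the real field name):

search | eval l = len(long_field) | stats max(l) AS max_len

If max_len tops out just under 10,240 while the raw events are longer, that points at the auto-KV limit rather than at indexing.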
How many bytes does a character take?
Thanks! That did it. I created a limits.conf file with maxchars = 20000
and it seems to be working as expected. Are there any known issues with increasing this value even higher? I'm seeing that some events have a length greater than 19,000.
Hey @Ayn, is there an upper limit on this setting?
Do shorter fields with the same format work as they should? Or might this be an issue with the extraction itself?
It's a space-delimited field (field="value1 value2 value3 value4 value5 value6 ..."), so it's just using Splunk's default extraction; nothing special is being applied to the file.
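For a field in that shape, a search-time rex could also pull the value explicitly. A sketch, assuming the raw event contains field="..." and using long_field as a placeholder name (note that rex is governed by its own [rex] stanza in limits.conf, separate from the auto-KV maxchars):

search | rex "long_field=\"(?<long_field>[^\"]*)\"" | eval l = len(long_field)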
How is long_field extracted?