In "Does TRUNCATE specify the ultimate size of an event?" we looked at standard logging, and we are comfortable with TRUNCATE for the maximum line length and MAX_EVENTS for the maximum number of lines per event. We are trying to establish limit standards for our data, and we don't yet know how JSON-type data fits into these limits.
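As a starting point, here is a minimal props.conf sketch for a hypothetical JSON sourcetype (the stanza name and values are illustrative, not recommendations). JSON events often arrive as a single long line, so TRUNCATE (bytes per line) usually matters more for them than MAX_EVENTS (lines per merged event):

```ini
# props.conf - illustrative values only; tune for your data volumes
[my_json]
# hypothetical sourcetype name
INDEXED_EXTRACTIONS = json
# Raise TRUNCATE above the 10000-byte default if your JSON lines are long
TRUNCATE = 100000
# MAX_EVENTS only applies to multi-line (line-merged) events; default is 256
MAX_EVENTS = 256
```

If your JSON is pretty-printed across many lines rather than one line per event, the situation reverses and MAX_EVENTS becomes the limit to watch.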
I would also use this search to check whether there are any issues with the ingested data in terms of truncation / line breaking, alongside the props.conf settings / max length discussed in the link above:
index=_internal ( warn OR error ) NOT StreamedSearch (LineBreakingProcessor OR "AggregatorMiningProcessor - Breaking event")
| rex "limit of (?<limit>\d+)"
| eval src=if(component="LineBreakingProcessor", "Line is too long - adjust TRUNCATE setting (keep an eye on line breaking)", "Too many lines - work on line breaking")
| stats count as events values(limit) as limit dc(data_source) as sources dc(data_host) as hosts by data_sourcetype src
| stats sum(events) as total_events list(events) as events list(src) as issue list(limit) as limit list(sources) as sources list(hosts) as hosts by data_sourcetype
| sort -total_events
If there are no parsing issues, your config should be good.
Note: I would revisit the config on a regular basis; there are times when a data feed falls outside the set limits, and periodic review helps with refinement.
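To make that regular review easier, the search above can be scheduled as a saved search. A sketch of a savedsearches.conf stanza, assuming a hypothetical name and a weekly cadence (adjust both to suit):

```ini
# savedsearches.conf - hypothetical stanza name and schedule
[Parsing Limits Health Check]
enableSched = 1
# run weekly, Monday at 06:00
cron_schedule = 0 6 * * 1
dispatch.earliest_time = -7d@d
dispatch.latest_time = now
search = index=_internal (warn OR error) NOT StreamedSearch (LineBreakingProcessor OR "AggregatorMiningProcessor - Breaking event") | rex "limit of (?<limit>\d+)" | stats count by component, data_sourcetype
```

You could also attach an alert action to this so new truncation or line-breaking warnings surface without anyone having to remember to re-run the search.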