Hi,
Here is my situation (and I know it isn't ideal, but I have to work with it for now).
I have scripts that pre-process log files into a standard format that Splunk ingests. The format itself isn't really important here; the problem is that I now have to add two more fields to the file that is created.
How do I handle this change in the format of my input file with the setup that I currently have?
I want the old data to remain unchanged, and perhaps also add a default value for the 2 new fields?
inputs.conf
[monitor:///var/log/error-monitoring]
followTail = 0
sourcetype = psv
host =
host_regex = ([^/.]+).
index = test-index
props.conf
[psv]
REPORT-PSV = psv-delim
pulldown_type = 1
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
transforms.conf
[psv-delim]
DELIMS = "|"
FIELDS = server,service,date,type,requestId,class,message
....
So essentially my new input file would have the fields server,service,date,type,requestId,class,message,user,id
Halp.
You should just be able to add the two fields to the FIELDS list in transforms.conf without impacting existing events that don't contain data for those fields.
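Assuming the two new fields are appended at the end of each line (as in the field order you listed), the updated stanza would look like this:
transforms.conf
[psv-delim]
DELIMS = "|"
FIELDS = server,service,date,type,requestId,class,message,user,id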
To get default values assigned to those two fields for older events, you can set up two calculated fields for your sourcetype, like so:
fieldname: user
eval: if(isnull(user), "defaultUser", user)
Do the same thing for your default id.
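In props.conf form, the two calculated fields would look like the stanza below ("defaultUser" and "defaultId" are placeholder values; substitute whatever defaults make sense for your data):
props.conf
[psv]
EVAL-user = if(isnull(user), "defaultUser", user)
EVAL-id = if(isnull(id), "defaultId", id)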
BTW, I'd recommend picking a more descriptive name for your sourcetype. PSV describes the format, not so much the content. Maybe something like [ErrorMonitoring]. In most cases that makes it easier for users to get an idea of what data they're looking at. Just a thought.