I set up the following after reading many answers here on this subject. I have CSV input, so I did this to override automatic sourcetyping.
props.conf
[source::C:\Documents and Settings\Sample\]
sourcetype = mytype
priority = 100
KV_MODE = none
CHECK_FOR_HEADER = false
REPORT-alarmtest = alarmtest
inputs.conf
[monitor:://C:\Documents and Settings\Sample\]
sourcetype = mytype
The appropriate transforms.conf is set, and btool output looks fine with all my fields. But when I test the sourcetype (splunk test sourcetype filename) I see that my props.conf is not being applied:
Attr:REPORT-AutoHeader AutoHeader-2
Attr:sourcetype csv-3
What else can I do/test?
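Since btool already lists your settings, its --debug flag will also show which file each setting is actually coming from, which helps spot a competing stanza overriding yours. A sketch (run from the bin folder under your Splunk install; filtering the output with findstr/grep is optional):

```
splunk btool props list --debug
```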
Wow, will the fun never end? Sorry pjmenon, you seem to be running into all kinds of trouble with this fairly simple setup. I know how frustrating that can be.
Let me try posting a full set of configs for you. As much as possible, try to move any other settings out of the way (since I think you're just getting started with Splunk, there shouldn't be that much stuff). You could ZIP up your existing files, or just move them to a "local.temp" folder or whatever.
For the purpose of a test, name your input log file something like painful*.csv
and put it in your "Sample" folder (case matters here, by the way.)
Then create the following in your \etc\system\local
folder. (Make sure you don't use that text editor that was adding \par
to all your lines, 😉 We don't want any more of that trouble.)
props.conf:
[source::...[/\\]Sample[/\\]painful*.csv]
sourcetype = painful_csv
[painful_csv]
SHOULD_LINEMERGE = False
TIME_PREFIX = ^(?:[^,]*,){5}\s*
TIME_FORMAT= %Y-%m-%d %H:%M:%S
KV_MODE = none
REPORT-fields = painful_csv-fields
transforms.conf:
[painful_csv-fields]
DELIMS = ","
FIELDS= "severity", "alm_no", "site_id", "alm_type", "rsv1", "start_time", "end_time", "duration", "rsv2"
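To make the delimiter mapping concrete, here is a made-up sample line of the kind this config expects (invented values, purely illustrative):

```
MAJOR,1042,SITE01,COMM,-,2010-07-15 09:30:00,2010-07-15 09:45:00,900,-
```

With the FIELDS list above this would give severity=MAJOR, alm_no=1042, site_id=SITE01, and so on. Note also that TIME_PREFIX = ^(?:[^,]*,){5}\s* skips the first five comma-separated fields, so the timestamp is parsed from the sixth column (start_time) using the TIME_FORMAT above.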
inputs.conf:
[monitor://C:\Documents and Settings\Sample]
Then run the command line tool:
splunk test sourcetype "C:\Documents and Settings\Sample\painful.csv"
Then, if you don't mind dumping all your indexed data so far, clean the indexes. This will unrecoverably delete all your Splunk-indexed data, but your log files will be left untouched; Splunk just clears out all of its indexes and other internal state. (This is a helpful tool when you first start using Splunk, but after you start collecting real data I would never suggest it.) After stopping the Splunk services, run:
splunk clean all
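As a gentler alternative to clean all, my understanding is that splunk clean eventdata removes only the indexed events (and can be limited to a single index), leaving other settings in place; for example:

```
splunk clean eventdata -index main
```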
Now start Splunk back up and go see if your events have the proper fields.
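A quick sanity-check search for that, using the field names from the transforms.conf above:

```
sourcetype=painful_csv | head 5 | table severity, site_id, start_time, duration
```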
A few notes:
[/\\]
in props.conf matches both Unix- and Windows-style path separators, which I also recommend.
Ever since this problem started, I wiped out the entire etc/apps/learned/local/*.conf. So there seems to be something else. I bumped up the priority after reading a couple of posts on this. It appears that the problem exists for the csv file type.
(1) Weird. There doesn't seem to be any default .csv matching, so I'm not sure why you have to bump up the priority; it seems like you have other props entries getting in the way. Sometimes cleaning out etc/apps/learned/local/*.conf is helpful. (2) Using a source pattern simply to assign a sourcetype is done frequently. If you look at some of the entries in default/props.conf you will see this pattern used. (I'm not sure if/where it would be in the docs.) Glad you've got a working config.
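For illustration, an entry of that shape would look something like the following (an invented example of the pattern, not copied from the shipped default/props.conf):

```
[source::...[/\\]var[/\\]log[/\\]mylog*.log]
sourcetype = my_log_type
```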
1) priority SHOULD be set >= 100 for CSV types in props.conf.
2) Lowell's suggestion of [/\\] should be used for this to work. My full path does not work.
3) sourcetype = type must be followed by a [type] stanza. It does not work otherwise. I did not see this in the documentation. Did I miss it?
Everything else is not a factor. Finally!! 🙂
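Putting those three conditions together, a minimal props.conf for this case would look roughly like the following (a sketch reusing the stanza and report names from this thread; the path pattern is an assumption about your layout):

```
[source::...[/\\]Sample[/\\]*.csv]
sourcetype = mytype
priority = 101

[mytype]
SHOULD_LINEMERGE = False
KV_MODE = none
REPORT-alarmtest = alarmtest
```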
I did everything above and it still did not work; [autoheader] was being added. So I just added one line in props.conf: priority = 101
Voila! It worked!!! Then I added and removed stuff from my configuration to see where the problem was. See the next comment for all the conditions. Thank you so much, Lowell. It is such a relief...
Try changing your props.conf as such:
[source::C:\Documents and Settings\Sample\*.csv]
You can find the specifications for props.conf here: http://www.splunk.com/base/Documentation/4.1.3/Admin/Propsconf
ftk - Thanks much for taking the time to debug with me. See below for what the issue was.
@pjmenon -- would you mind posting your transforms.conf?
I have a custom field extraction in transforms.conf and the associated stanza in props.conf. This props.conf is not picked up (it gets overridden), and so my field extraction is not working. My whole idea of changing these conf files was to enforce my field extraction.
@pjmenon -- If the system picks up the files with the correct sourcetype at index time, I'm not quite sure what the problem is that you're having?
I am running this from the CLI, but I do restart Splunk when I check things in the web interface. Yes, the monitor picks up the right sourcetype. [monitor:://C:\Documents and Settings\Sample]
_rcvbuf = 1572864
evt_dc_name =
evt_dns_name =
host = S101401
index = default
sourcetype = mytype
@pjmenon -- additionally I am not sure if test sourcetype will act on source:: fields. When your monitor picks up your files does it assign the correct sourcetype?
@pjmenon -- I assume you did restart splunk or run a | extract reload=true to make sure the new configs are applied?
Correction: with *.csv at the end, same problem. When I create a new extension like "mytype" and test, the Attr:REPORT- field is missing but the sourcetype is good (Attr:sourcetype mytype).
Either way, it is not taking my props.conf.
Tried that too. I completely got rid of csv and created a new extension. Still the same problem.