Getting Data In

SharePoint ULS logs

Bulluk
Path Finder

Has anyone indexed SharePoint ULS logs? I've edited my inputs.conf to index the log directory, but I end up with multiple sourcetypes in Splunk that look like Server-1, Server-2, Server-3, etc. What sourcetype should I define in inputs.conf? Below is what I have right now:

[default]
host = Server

[script://$SPLUNK_HOME\bin\scripts\splunk-perfmon.path]
disabled = 0

[monitor://C:\inetpub\logs\logfiles]
sourcetype=iis

[monitor://C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS]

Thanks in advance

1 Solution

Bulluk
Path Finder

It took a lot of reading but I got there in the end. If anyone else has this requirement, this is what I did:

On the forwarder

**inputs.conf**

[default]
host = UKMLWSPW102

[monitor://c:\PathToULSLogs]
sourcetype = uls

**props.conf**

[uls]
CHECK_FOR_HEADER = False

On the indexer

**props.conf**

[uls]
TIME_PREFIX = :\s
MAX_TIMESTAMP_LOOKAHEAD = 128
TIME_FORMAT = %m-%d-%Y %H:%M:%S.%3N
REPORT-uls = uls
TZ = GMT

**transforms.conf**
[uls]
FIELDS="Timestamp", "Process", "TID", "Area", "Category", "EventID", "Level", "Message", "Correlation"
DELIMS = "\t"

What this does is override Splunk's default behaviour of checking the log file header on the forwarder, meaning the indexer receives the data "untouched". On the indexer, props.conf sets the timestamp and timezone, while transforms.conf tells Splunk that the file is tab-delimited and gives it the names of the columns.
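For example, once those extractions are in place, a search along these lines (just a sketch; the field names come from the transforms.conf stanza above, and Level=Unexpected is only an illustrative value) lets you filter and summarise on the extracted columns:

sourcetype=uls Level=Unexpected | stats count by Category, EventID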


Bulluk
Path Finder

The best I can offer is to use the transaction command with the TID or Correlation field and then search within that. Not really what you're after, but I'm a novice myself.
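For instance, something roughly like this (just a sketch using the field names from the transforms.conf above; maxspan and the Message filter are arbitrary illustrative values) groups events that share a correlation ID so you can then search within each group:

sourcetype=uls | transaction Correlation maxspan=5m | search Message="*error*"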


neilamoran
Explorer

Did you manage to get around the issues with multiline events in ULS logs? I'm kind of halfway to resolving that one, but it breaks the field extractions.

See my post at http://splunk-base.splunk.com/answers/28974/multiline-event-query-sharepoint-logs for more details.
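For anyone else hitting the multiline problem, the usual knobs are SHOULD_LINEMERGE and LINE_BREAKER in props.conf on the indexer. This is only a generic sketch, not a tested ULS config; the regex assumes each event starts with a timestamp in the MM-DD-YYYY format used above and would need adjusting for your data:

[uls]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{2}-\d{2}-\d{4}\s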
