Getting Data In

Index Aggregate of Apache Access Logs

dswanson99
Path Finder

I have a series of servers running Apache that serve up the same URL via POST 99% of the time, and in high volume. Indexing every entry individually would eat up way too much of the indexing volume, so currently they're excluded.

Using awk I can process the file at log rotation time and produce aggregates like this:
28/Sep/2011:11:40 count=20393 avgsize=32535 avgtime=150 maxtime=710
That's one line per five-minute interval, per server.
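
Roughly speaking, the awk looks like this (treat it as a sketch only: the field positions, the response-time field, and the access_log filename depend on our custom LogFormat):

    awk '
    {
        ts  = substr($4, 2)                       # strip the leading "["
        day = substr(ts, 1, 14)                   # 28/Sep/2011:11
        min = substr(ts, 16, 2) + 0               # minute of the hour
        bucket = sprintf("%s:%02d", day, int(min / 5) * 5)   # 5-minute bucket
        count[bucket]++
        bytes[bucket] += $10                      # response size in bytes
        rtime[bucket] += $NF                      # response time (assumed last field)
        if ($NF > maxt[bucket]) maxt[bucket] = $NF
    }
    END {
        for (b in count)
            printf "%s count=%d avgsize=%d avgtime=%d maxtime=%d\n",
                   b, count[b], bytes[b] / count[b], rtime[b] / count[b], maxt[b]
    }' access_log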

I'd like the information closer to real time than waiting until the end of the week. Is there any way to do this completely within Splunk (without indexing every access log entry)? Or is there another way, where I cron something to run periodically and write to a log file that Splunk then eats?

Thanks
-Doug


jbsplunk
Splunk Employee

I would suggest monitoring the file directly and using nullQueue routing to prevent the data from being indexed. All you'd need to do is come up with a regex that matches the URL showing up 99% of the time. Instructions for this can be found here:

http://docs.splunk.com/Documentation/Splunk/latest/Deploy/Routeandfilterdatad#Filter_event_data_and_...
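
As a rough sketch, the config looks something like this (the sourcetype, stanza name, and URL in the regex are placeholders you'd need to adapt):

    # props.conf -- attach the filter to the relevant sourcetype
    [apache:access]
    TRANSFORMS-null = drop_highvolume_post

    # transforms.conf -- route matching events to the nullQueue
    [drop_highvolume_post]
    REGEX = POST\s+/your/high/volume/url
    DEST_KEY = queue
    FORMAT = nullQueue

These go on the instance that parses the data (indexer or heavy forwarder), and a restart is needed for them to take effect.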

To answer your question, you could write a script and put it in the crontab if you'd like, and Splunk can eat the file via a monitor stanza, but I think you'd be better off doing the nullQueue routing and then just using the search language to produce the output you're interested in seeing.
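
For whatever you do index, something along these lines would produce a similar rollup (sketch only; the sourcetype and the bytes / req_time field names depend on your extractions):

    sourcetype=access_combined
    | timechart span=5m count, avg(bytes) AS avgsize, avg(req_time) AS avgtime, max(req_time) AS maxtime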


jbsplunk
Splunk Employee

You can set up a scripted input to run on a cron-type schedule within Splunk:

http://docs.splunk.com/Documentation/Splunk/latest/Developer/ScriptSetup
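
A minimal inputs.conf stanza for that looks something like this (the script path and sourcetype are placeholders):

    # inputs.conf -- run the aggregation script every 5 minutes
    [script://$SPLUNK_HOME/bin/scripts/apache_aggregate.sh]
    interval = 300
    sourcetype = apache:aggregate
    disabled = 0

interval also accepts a cron expression if you'd rather schedule it that way.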


dswanson99
Path Finder

Thanks. I'm already filtering them to exclude them from indexing. I'm guessing there is nothing more I can do with Splunk at that point (like also sending them to a text file that I could process via a cron job)?

At this point I may have to spend some quality time with sed & awk to process the Apache log at intervals, keep track of where I am in the file, and hand-feed Splunk.
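
If I end up doing that, the offset tracking will probably look something like this (sketch only; the paths and the aggregate.awk script are placeholders):

    #!/bin/sh
    # Process only what was appended to the access log since the last run,
    # then remember the new offset for next time.
    LOG=/var/log/httpd/access_log
    STATE=/var/tmp/access_log.offset

    last=$(cat "$STATE" 2>/dev/null || echo 0)
    size=$(wc -c < "$LOG")

    # If the file shrank, it was rotated; start again from the beginning.
    [ "$size" -lt "$last" ] && last=0

    tail -c +"$((last + 1))" "$LOG" \
        | awk -f aggregate.awk >> /var/log/httpd/access_aggregate.log

    echo "$size" > "$STATE"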
