The hosting provider is Rackspace Cloud Sites. In the root of each site is a logs dir, e.g. somesite.com/logs. There are two different logs I want to grab each day:
1) ourlog_posts_YESTERDAYSDATE.csv
2) access_log_YESTERDAYSDATE.zip
Note that I'm unable to install a Universal Forwarder, as this is essentially shared hosting. So I want to grab each log every morning and get it into Splunk for indexing.
I'd think someone has come up against this problem before and I'd like to hear your solution.
Assuming your cron scripts/actions can get the files off the hosting system and onto a system that Splunk has access to, you could configure a local directory monitor input on the Splunk server (or a forwarder, etc.) and associate the desired metadata (host, index, sourcetype) with any data that's indexed from that source. The cron job then just drops the files there.
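The fetch side might look something like the sketch below. It's a minimal example assuming the logs dir is reachable over FTP (Cloud Sites typically exposes site files that way) -- the hostname, credentials, drop directory, and the date format in the filenames are all placeholders you'd adjust for your site:

```shell
#!/bin/sh
# Hypothetical nightly fetch of yesterday's logs into a directory that
# Splunk monitors. Host, credentials, paths, and the date format in the
# filenames are assumptions -- adjust to match your environment.

YESTERDAY=$(date -d yesterday +%Y-%m-%d)   # GNU date; on BSD/macOS: date -v-1d +%Y-%m-%d
BASE=${SPLUNK_DROP:-/tmp/splunk_drop/somesite.com}  # dir Splunk monitors
mkdir -p "$BASE/posts" "$BASE/access"

# fetch <remote-filename> <local-subdir>: pull one file over FTP,
# warn (but keep going) if it isn't there yet.
fetch() {
    curl -sf --user "$FTP_USER:$FTP_PASS" \
        "ftp://ftp.somesite.com/logs/$1" -o "$BASE/$2/$1" \
        || echo "WARN: could not fetch $1" >&2
}

fetch "ourlog_posts_${YESTERDAY}.csv" posts
fetch "access_log_${YESTERDAY}.zip"  access
```

Run it from cron early each morning (e.g. `15 6 * * *`). Dropping each log type into its own subdirectory makes it easy to assign a different sourcetype to each. Splunk's monitor input can generally read compressed archives like the .zip directly, so unpacking shouldn't be necessary.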
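On the Splunk side, the monitor input could be a pair of stanzas in inputs.conf -- one per log type so each gets its own sourcetype. The paths, index name, and sourcetypes here are illustrative, not prescriptive:

```ini
# inputs.conf -- hypothetical monitor stanzas for the drop directory
[monitor:///tmp/splunk_drop/somesite.com/access]
index = web
sourcetype = access_combined
host = somesite.com
disabled = false

[monitor:///tmp/splunk_drop/somesite.com/posts]
index = web
sourcetype = csv
host = somesite.com
disabled = false
```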
That's my initial thought, but I thought I'd solicit other ideas too.