Hi,
I use a CSV file as a lookup in a search command like this:
sourcetype="airmantool" | rex ".\s(?[A-Z]+)\s+[(?\w+)]|(?.)" | sort _time | lookup AirmanTool_Lookup.csv AirmanTool_message AS AirmanTool_message OUTPUT Commentary AS AirmanTool_Explanation Procedure AS AirmanTool_Procedure
The CSV file is currently on the local server (in the lookups folder), but I want to use a CSV that lives on a remote server, not a local one. The CSV is frequently updated.
How can I do this?
Agree with @changux: if you can find a way to periodically move the CSV to the local server in an automated fashion, that would make life easier.
I had a very similar issue. I had a PowerShell script, run daily, that created a CSV file on a Windows server which was also configured as a heavy forwarder. Splunk monitored the output folder and read in each new file, sending it to one of the index servers. The search head then ran a scheduled search and created a new local lookup file using outputcsv.
This turned out to be inconsistent. It was far simpler to have the PowerShell script save the output directly to the search head's lookup folder (there are multiple ways to do this).
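To illustrate the "save directly to the lookup folder" approach, here is a minimal shell sketch (the original was PowerShell, but the idea is the same). The `$SPLUNK_HOME` default and the `search` app name are assumptions; adjust for your environment.

```shell
#!/bin/sh
# Sketch: publish a freshly generated CSV into the search head's lookup
# folder so the lookup command picks up the new file on the next search.
# Assumptions: SPLUNK_HOME layout, app name "search".
publish_lookup() {
    src="$1"
    dest="${SPLUNK_HOME:-/opt/splunk}/etc/apps/search/lookups"
    mkdir -p "$dest"        # make sure the lookups folder exists
    cp "$src" "$dest/"      # overwrite the old lookup file atomically enough for most cases
}

# Example:
# publish_lookup /data/exports/AirmanTool_Lookup.csv
```

If the generating script runs on a different machine than the search head, swap the `cp` for `scp`/`rsync` with key-based authentication.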
Hi. Check this related answer:
http://answers.splunk.com/answers/124999/get-data-lookup-from-other-remote-peer.html
My opinion: I prefer to set up a crontab job that copies the remote file to the local server every X minutes.
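For example, a cron entry like the following would pull the remote CSV every 5 minutes. The host, paths, and app name here are assumptions; key-based SSH authentication is assumed so scp runs non-interactively.

```shell
# Crontab sketch: copy the remote lookup CSV to the search head every 5 minutes.
# Edit with "crontab -e"; user, host, and paths are placeholders.
*/5 * * * * scp user@remotehost:/data/AirmanTool_Lookup.csv /opt/splunk/etc/apps/search/lookups/AirmanTool_Lookup.csv
```

A shorter interval is possible, but keep it comfortably above the time a single copy takes so runs do not overlap.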