Hello,
I recently created a custom search command allowing me to output results of a search directly to hdfs via webhdfs.
After some time trying to figure out the Python SDK, I've finally managed to achieve my purpose.
My command works as follows:
[splunk search] | outputhdfs [path/to/hdfs/] [fileName] [fields_to_export] [separator]
What I'd like to do is pass a field value, such as a date, as my fileName, but that doesn't seem to work.
For example, when I run:
[splunk_search] |eval datePart = strftime(s_cnxTime,"%Y%m%d") | outputhdfs path="/tmp/outputhdfs/" file=datePart fields="KeyID, ConnexionTime, Suspect, Time_Transfert" separator="#"
it creates a file literally named datePart at the specified path, instead of one named after the field's value.
If you have any ideas 😉
Regards,
Wandrille
Although I am not sure why your command is not working, I was wondering why you don't just use the Splunk-supported app, Hadoop Connect, to export search results into HDFS.
Here is the link to the feature in Hadoop Connect that seems to do the same as outputhdfs: http://docs.splunk.com/Documentation/HadoopConnect/1.2.3/DeployHadoopConnect/ExporttoHDFS
The command works; we didn't use the export from Hadoop Connect because we needed a custom command anyway. The question here concerns only how to pass a date (a field value) as an argument to the custom command.
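For what it's worth, the behaviour described above is expected: arguments to a custom search command arrive as literal strings, so file=datePart is the text "datePart", not the field's value. One workaround is to have the command treat the file argument as a field name and look it up in each result row, falling back to the literal string when no such field exists. The sketch below shows only that lookup-and-bucket logic in plain Python; the function names and record shape are illustrative, not the actual outputhdfs implementation, and the real command would do this inside its Splunk SDK stream loop before writing to HDFS via webhdfs.

```python
# Illustrative sketch, not the real outputhdfs code: resolve the "file"
# argument per event, preferring a field value over a literal name.

def resolve_filename(record, file_arg):
    """If file_arg names a field present in the record, use that field's
    value as the filename; otherwise treat file_arg as a literal name."""
    return record.get(file_arg, file_arg)

def group_by_filename(records, file_arg):
    """Bucket result rows by resolved filename, so each bucket can then be
    written to its own HDFS file (e.g. one file per datePart value)."""
    buckets = {}
    for rec in records:
        buckets.setdefault(resolve_filename(rec, file_arg), []).append(rec)
    return buckets
```

With this approach, `| eval datePart = strftime(s_cnxTime,"%Y%m%d") | outputhdfs ... file=datePart` would write each event into a file named after its datePart value, while `file=report.csv` would still behave as a plain literal filename.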