I am working with Talend Open Studio v5.2.
When a job fails in Talend a log file is generated in a specified location with predefined format (pipe delimited format).
e.g.:
moment|pid|project|job|language|origin|status|substatus|description
2013-01-21 18:44:29|Reek96 ; Process_Name : wf_Process_Name ; Process_sk : 121212 ; Process_Run_sk : 481 ; Batch_sk : 566556|TALEND|Job_Name|java||Failed|Job execution error|ORA-00904: "ENTITY_NAMED": invalid identifier
The above log is generated when a Talend job fails.
Please note that the second pipe-delimited field (the long one beginning with Reek96) is the pid.
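To illustrate the format, here is a minimal sketch of how one of these pipe-delimited lines splits into fields, pairing each value with the corresponding header name (the sample values are taken from the log above):

```python
# Pair each header field name with the corresponding value
# from a pipe-delimited Talend failure-log line.
header = "moment|pid|project|job|language|origin|status|substatus|description"
line = ('2013-01-21 18:44:29|Reek96 ; Process_Name : wf_Process_Name ; '
        'Process_sk : 121212 ; Process_Run_sk : 481 ; Batch_sk : 566556|'
        'TALEND|Job_Name|java||Failed|Job execution error|'
        'ORA-00904: "ENTITY_NAMED": invalid identifier')

fields = dict(zip(header.split("|"), line.split("|")))
print(fields["pid"])     # the long second field beginning with Reek96
print(fields["status"])  # -> Failed
```

This is essentially the mapping Splunk will perform once field extraction is configured.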
Now, moving one step forward, I want to integrate this with Splunk.
Is this possible?
Thanks in advance.
Regards,
Sam
Thanks a lot, guys.
Currently I am checking the feasibility.
I will surely have a few more queries when I start implementing this, maybe in a couple of days.
Regards,
Sam
There are essentially two main steps to get the Talend log event data into Splunk:
1) Set up Splunk to monitor the directory the log file is written to: http://docs.splunk.com/Documentation/Splunk/latest/Data/Monitorfilesanddirectories
2) Configure field extraction based on the header row (which you'll use as the field names) and the pipe-delimited fields (which will be the field values): http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileheadersatindextime
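As a rough sketch of those two steps, assuming the Talend logs land in /var/log/talend and a sourcetype named talend_job_log (both the path and the sourcetype name are assumptions for illustration), the monitor input and a search-time delimiter extraction could look something like:

```ini
# inputs.conf -- monitor the Talend failure-log directory
# (path and sourcetype are hypothetical; adjust to your setup)
[monitor:///var/log/talend]
sourcetype = talend_job_log

# props.conf -- attach the field extraction to the sourcetype
[talend_job_log]
REPORT-talend = talend_fields

# transforms.conf -- split on the pipe delimiter and name the
# fields after the header row from the Talend log format
[talend_fields]
DELIMS = "|"
FIELDS = "moment","pid","project","job","language","origin","status","substatus","description"
```

With that in place you could search, for example, `sourcetype=talend_job_log status=Failed` to pull up failed job events.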
Yes, it is possible. What part of the integration are you unsure about?