Hi,
I am getting the error below whenever I try to connect a Hadoop cluster with Splunk using Splunk
Hadoop Connect.
Unable to connect to Hadoop cluster Failed to run Hadoop CLI job command ls: No FileSystem for scheme: hdfs
However, the command below runs perfectly fine.
/usr/lib/hadoop/bin/hadoop fs -ls hdfs://localhost:8020/
The configuration I use in the app is as follows.
HDFS URI
localhost:8020
HADOOP_HOME
/usr/lib/hadoop
JAVA_HOME
/usr/lib/jvm/jdk1.8.0
Please let me know where I have gone wrong.
I figured out what was going on with this in my case. My issue was that I was using the client package from the Cloudera Hadoop distro, and my /usr/lib/hadoop had the binaries and many libraries for Hadoop, but it did not have the hadoop-hdfs jar file. Cloudera keeps a separate directory, /usr/lib/hadoop-hdfs, which contains the jar for HDFS. It's kludgey, but I copied hadoop-hdfs-2.6.0-cdh5.7.1.jar into /usr/lib/hadoop and then it worked.
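For anyone hitting the same thing, the workaround above amounts to "copy the hdfs jar into HADOOP_HOME if it isn't already there." A minimal sketch is below; it simulates CDH's split layout in a temp directory so you can see the check-and-copy logic safely (the real paths on my box were /usr/lib/hadoop and /usr/lib/hadoop-hdfs, and the jar version will differ on your install).

```shell
# Simulate CDH's layout: HADOOP_HOME lacks the hadoop-hdfs jar,
# which lives in a sibling hadoop-hdfs directory instead.
root=$(mktemp -d)
mkdir -p "$root/hadoop" "$root/hadoop-hdfs"
touch "$root/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.7.1.jar"

HADOOP_HOME="$root/hadoop"      # stands in for /usr/lib/hadoop
HDFS_LIB="$root/hadoop-hdfs"    # stands in for /usr/lib/hadoop-hdfs

# Copy the jar that provides the "hdfs" FileSystem into HADOOP_HOME,
# but only if no hadoop-hdfs jar is already present there.
if ! ls "$HADOOP_HOME"/hadoop-hdfs-*.jar >/dev/null 2>&1; then
    cp "$HDFS_LIB"/hadoop-hdfs-*.jar "$HADOOP_HOME"/
fi

ls "$HADOOP_HOME"
```

On a real install you would run the same `cp` against the actual /usr/lib paths (and repeat it after CDH upgrades, since the version in the jar name changes).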
Hi @Krish_Splunk
I am quite surprised that, if you are using localhost as the Hadoop server, you are using "localhost:8020" at all; just select "Locally mounted Hadoop" and point it at the locally mounted file system.
Thanks,
Harshil
Hi Guys,
Any idea on this?
In Hadoop Connect, don't forget to add the / after 8020
It doesn't accept a "/" after the 8020. It errors with "invalid stanza".