
Can you help with "Could not find or load main class org.apache.hadoop.fs.FsShell" while configuring the Splunk Hadoop Connect application?

yko84109
Loves-to-Learn

Hi,

I'm trying to configure the Splunk Hadoop Connect application with the following configurations:

HDFS URI: mynamenode:8020
HADOOP_HOME: /opt/cloudera/parcels/CDH
JAVA_HOME: /usr/lib/jvm/java (also tried /usr/java/latest)
Namenode HTTP Port: 50070

And I got the following error:

Unable to connect to Hadoop cluster 'hdfs://mynamenode:8020/' with principal 'None': Failed to run Hadoop CLI job command '-ls' with options 'hdfs://mynamenode:8020/' Error: Could not find or load main class org.apache.hadoop.fs.FsShell.
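From the error it looks like the app is simply shelling out to the Hadoop CLI, so a rough manual equivalent would be the following (just a sketch using the paths from my configuration above, and assuming the CDH parcel exposes bin/hadoop; not anything Splunk documents):

export HADOOP_HOME=/opt/cloudera/parcels/CDH
export JAVA_HOME=/usr/lib/jvm/java
# this should list the HDFS root if the client is set up correctly
$HADOOP_HOME/bin/hadoop fs -ls hdfs://mynamenode:8020/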

How can I solve this?

Thanks!


rdagan_splunk
Splunk Employee

To debug this, can you run the following commands from the command line:
which hadoop
and
which java
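If both of those resolve, a further sanity check (just a suggestion, not an official Splunk step) is to confirm that the Hadoop CLI can actually build its classpath, since "Could not find or load main class org.apache.hadoop.fs.FsShell" usually means the Hadoop client jars are not on it:

# prints the Hadoop version; fails if the wrapper script or jars are broken
hadoop version
# prints the classpath the hadoop wrapper builds; the entries should cover
# the hadoop-common jar, which is where org.apache.hadoop.fs.FsShell normally lives
hadoop classpath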


yko84109
Loves-to-Learn

Hi.
That's what I'm doing, and I still get the same error.


rdagan_splunk
Splunk Employee

As you can see from this video (https://www.youtube.com/watch?v=TmYHsabpk_Q), when you set up the Java path and Hadoop path inside Splunk Hadoop Connect, you should not include the bin/hadoop or bin/java part.
Can you share the output of the 'which hadoop' and 'which java' commands?
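Put differently, both fields should point at directories, not at the binaries themselves. A quick way to sanity check the values from your original post (just a sketch; adjust the paths to your install):

# both of these files should exist if HADOOP_HOME and JAVA_HOME are set
# to the directory values entered in Splunk Hadoop Connect
ls -l /opt/cloudera/parcels/CDH/bin/hadoop
ls -l /usr/lib/jvm/java/bin/java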
