
I am following this tutorial to install Hadoop on my computer. As far as I can tell, I have followed the instructions exactly, up to source ~/.profile. But when I try to format HDFS by running hdfs namenode -format, it fails with the following error:

ERROR: Cannot execute /usr/local/Cellar/hadoop/3.0.0/libexec/hdfs-config.sh

I tried a lot to look for the solution over the internet but didn't find a solution to it.


5 Answers


@BIKI I just ran into the same problem. The Hadoop 3.0.0 release has an unusual file layout that does not work when the home directory is set the way you would expect.

I am on macOS High Sierra (10.13) and installed via brew, but I think you would see something similar on Ubuntu or any other UNIX-like system.

Bottom line: if you want to track down the error, check HADOOP_HOME in your profile (.bash_profile) and the scripts that run when you start Hadoop. In my case, I have an alias in my profile called hstart, and it calls the following files:

start-dfs.sh

and

start-yarn.sh

Both of these call hdfs-config.sh, which cannot be found with the home directory set as above.

My Hadoop home directory was set to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0

And I changed it to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec

Of course you have to source your configuration profile, and in my case it was:

source .bash_profile

For me, this did the trick. Hope that helps!
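As a sketch, the fix above can be made a bit more robust in the profile by probing for hdfs-config.sh instead of hard-coding the extra libexec level. The helper name and the Cellar path below are just illustrative, matching this answer's install:

```shell
# Sketch: pick the HADOOP_HOME that actually contains libexec/hdfs-config.sh.
# Homebrew nests the real Hadoop tree one level down, under libexec.
pick_hadoop_home() {
  candidate="$1"
  if [ -f "$candidate/libexec/hdfs-config.sh" ]; then
    echo "$candidate"              # plain tarball layout
  elif [ -f "$candidate/libexec/libexec/hdfs-config.sh" ]; then
    echo "$candidate/libexec"      # Homebrew layout: descend into libexec
  else
    echo "$candidate"              # unknown layout: leave unchanged
  fi
}

export HADOOP_HOME="$(pick_hadoop_home /usr/local/Cellar/hadoop/3.0.0)"
```

With this in .bash_profile, the same profile works whether Hadoop came from a plain tarball or from Homebrew.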


1 Comment

Changing HADOOP_HOME to ../libexec and running source ~/.zshrc worked for me.

It looks like the latest version has issues with Brew. I downloaded Hadoop 2.8.1 directly from here instead.

Follow the same instructions; it works.



Same issue with Hadoop 3.1.1 and above installed via Brew: HADOOP_HOME was not set up properly. Run:

$ echo $HADOOP_HOME

If you see "/usr/local/Cellar/hadoop", you have to append your specific Hadoop version:

$ export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
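A hedged sketch of the same fix in script form, deriving the versioned path from the Cellar directory instead of hard-coding 3.1.1. The helper name is made up here, and it assumes a single Hadoop version is installed under that prefix:

```shell
# Sketch: append the installed version to the Cellar prefix automatically.
# Assumes exactly one version directory exists under the hadoop Cellar path.
resolve_hadoop_home() {
  cellar="$1"
  for d in "$cellar"/*/; do
    if [ -d "$d" ]; then
      echo "${d%/}"      # first (and assumed only) version directory
      return 0
    fi
  done
  echo "$cellar"         # no version directories found: return prefix unchanged
}

export HADOOP_HOME="$(resolve_hadoop_home /usr/local/Cellar/hadoop)"
```

This way the profile keeps working after a brew upgrade bumps the version number, as long as only one version remains installed.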



Another reason you may get this error is missing permissions for running Hadoop on localhost. We usually configure SSH so that we do not have to type passwords and do not have to give Hadoop root permissions; in other words, we configure Hadoop to run in non-root mode. Either use sudo every time you run Hadoop, or set up the SSH key correctly.

Example:

$ ssh-keygen -t rsa -P ""
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
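The two commands above can be wrapped into a reusable step that also sets the permission bits sshd insists on. This is a sketch: the function name is made up, the directory argument lets you point it at ~/.ssh, and whether ssh localhost then succeeds still depends on sshd / Remote Login being enabled on your machine:

```shell
# Sketch: create a passwordless SSH key for a single-node Hadoop setup and
# install it in authorized_keys, with the permissions sshd requires.
setup_passwordless_ssh() {
  sshdir="$1"
  mkdir -p "$sshdir" && chmod 700 "$sshdir"
  # generate a key only if one is not already there
  [ -f "$sshdir/id_rsa" ] || ssh-keygen -q -t rsa -P "" -f "$sshdir/id_rsa"
  cat "$sshdir/id_rsa.pub" >> "$sshdir/authorized_keys"
  chmod 600 "$sshdir/authorized_keys"  # sshd rejects lax permissions here
}

# setup_passwordless_ssh ~/.ssh   # then "ssh localhost" should not prompt
```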



I added sudo in front of hadoop-3.3.6/bin/hdfs namenode -format and it worked.

Like this:

$ sudo hadoop-3.3.6/bin/hdfs namenode -format

