
I have been following this link for the Hadoop 1.1.1 installation.

All my files and permissions have been set according to that guide, but I am getting the error below. Please help.

hduser@ubuntu:/usr/local/hadoop$ bin/start-all.sh
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out' for reading: No such file or directory
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out' for reading: No such file or directory

  • Have you formatted the namenode before running "bin/start-all.sh"? It looks like your namenode is not formatted properly (see the sketch after these comments). Commented Feb 28, 2013 at 15:07
  • Furthermore, please check the permissions on /usr/local/hadoop/logs/. Can you access this directory as hduser? Commented Feb 28, 2013 at 18:18
  • stackoverflow.com/questions/11672672/… Commented Oct 27, 2014 at 4:07
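
For the formatting step mentioned in the first comment, the Hadoop 1.x command is the one below, run as hduser once the permission problem is fixed. This is a sketch assuming the same /usr/local/hadoop install as in the error output; note that formatting wipes any existing HDFS metadata:

cd /usr/local/hadoop
bin/hadoop namenode -format   # run as hduser; erases existing HDFS metadata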

3 Answers


As the error suggests, you're having a permission problem. You need to give hduser the proper permissions on the Hadoop directory. Try:

sudo chown -R hduser /usr/local/hadoop/
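
After that, it is worth checking that hduser can actually create the logs directory the start scripts complain about; a quick sanity check, assuming the same /usr/local/hadoop layout as in the error output:

ls -ld /usr/local/hadoop                        # should now list hduser as the owner
sudo -u hduser mkdir -p /usr/local/hadoop/logs  # should succeed with no "Permission denied"
bin/start-all.sh                                # then retry the start script as hduser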


Run this command to change the permissions of the Hadoop directory:

sudo chmod 750 /app/hadoop
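
If /app/hadoop does not exist yet or is still owned by root, it may also need to be created and handed to the Hadoop user first. This is a sketch assuming the hduser user and hadoop group from the usual single-node tutorial setup (adjust the names and paths if yours differ):

sudo mkdir -p /app/hadoop/tmp             # hadoop.tmp.dir parent (assumed from the tutorial)
sudo chown -R hduser:hadoop /app/hadoop   # assumed user/group names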



Two suggestions that often help:

  1. Check whether HADOOP_HOME and JAVA_HOME are set in your .bashrc file. Not setting these environment variables can also cause errors while starting the Hadoop cluster (see the sketch after this list).

  2. It is also useful to debug the error by going through the log files generated in the /usr/local/hadoop/logs directory.
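
A minimal sketch of those .bashrc entries, assuming Hadoop is installed in /usr/local/hadoop and an OpenJDK under /usr/lib/jvm (both paths are assumptions; point them at your actual install):

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64   # assumed JDK location
export HADOOP_HOME=/usr/local/hadoop                 # install directory used throughout this question
export PATH=$PATH:$HADOOP_HOME/bin

After editing, run source ~/.bashrc (or open a new shell) so the variables are picked up before starting the cluster.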

