
I am new to Hadoop and was trying to install a single-node standalone Hadoop setup on Ubuntu 14.04. I was following the Apache Hadoop documentation, and as given there, when I tried to run

$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.0.jar grep input output 'dfs[a-z.]+'

I got the java.net.ConnectException message:

Call From a1409User/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

I checked http://wiki.apache.org/hadoop/ConnectionRefused, where it is suggested to verify that there isn't an entry mapping the hostname to 127.0.0.1 or 127.0.1.1 in /etc/hosts. This point is not entirely clear to me, but I tried changing the given IP and specifying the port number, with no luck. I also checked with telnet:

$ telnet localhost 9000
Trying 127.0.0.1...
telnet: Unable to connect to remote host: Connection refused

Please help me to solve the issue.

  • Did you start the cluster with start-all.sh? Commented Sep 7, 2014 at 18:03
  • Yes, I did. I also checked the services with the jps command. Commented Sep 8, 2014 at 5:46
  • Which Hadoop version are you trying to install? Please use the documentation for the same version. Commented Sep 8, 2014 at 7:20
  • Hadoop version 2.5.0. I am using the docs for the same version (link given in the question). Commented Sep 8, 2014 at 14:14
  • What is the output of jps? Try using netstat to see if anything is listening on 9000. Also try setting up pseudo-distributed mode as I did. Commented Sep 8, 2014 at 14:23
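The checks suggested in the comments (jps, netstat) can be sketched like this; it is a hedged diagnostic, assuming a JDK (for jps) and either netstat or ss is installed:

```shell
# Sketch of the diagnostics suggested in the comments.
# jps lists running Java daemons (NameNode, DataNode, etc. if HDFS is up).
jps_out=$(command -v jps >/dev/null 2>&1 && jps || echo "jps not found (needs a JDK)")
echo "$jps_out"
# Check whether anything is listening on port 9000 (netstat or ss).
listen_9000=$( (netstat -tln 2>/dev/null || ss -tln 2>/dev/null) | grep ':9000' || true)
echo "${listen_9000:-nothing listening on port 9000}"
```

If jps shows no NameNode, or nothing is listening on 9000, the "Connection refused" from the question is expected.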

3 Answers


Try formatting the namenode. Also, the input and output directories must be given in your command. For example:

hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.0.jar grep /user/hadoop/hadoop-config /user/hadoop/output 'dfs[a-z.]+'

After that you can check the content in the output directory by:

hdfs dfs -ls /user/hadoop/output/

It should print output as follows:

Found 2 items
-rw-r--r--   3 hadoop supergroup          0 2014-09-05 07:55 /user/hadoop/output/_SUCCESS
-rw-r--r--   3 hadoop supergroup        179 2014-09-05 07:55 /user/hadoop/output/part-r-00000
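The "format the namenode" step above can be sketched like this, assuming Hadoop 2.5.0's bin/ and sbin/ directories are on PATH (adjust paths to your install otherwise):

```shell
# Hedged sketch: re-initialize HDFS before rerunning the job.
# WARNING: formatting the namenode erases existing HDFS metadata, and
# "hdfs namenode -format" may prompt for confirmation if data exists.
hdfs_available=0
if command -v hdfs >/dev/null 2>&1; then
  hdfs_available=1
  stop-dfs.sh            # stop any running HDFS daemons first
  hdfs namenode -format  # re-initialize the namenode storage directory
  start-dfs.sh           # bring HDFS back up
else
  echo "hdfs not found on PATH; is Hadoop installed?"
fi
```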



  1. Confirm you are in Local (Standalone) Mode. I suspect you are not. You may have carried over a step from another mode: make sure you have not configured etc/hadoop/core-site.xml and etc/hadoop/hdfs-site.xml, since Standalone mode runs without them.

  2. If you want to try Pseudo-Distributed Mode, configure etc/hadoop/core-site.xml and etc/hadoop/hdfs-site.xml again.
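For reference, the pseudo-distributed configuration as given in the Apache Hadoop 2.5.0 single-node setup guide looks like this; the fs.defaultFS value on port 9000 is exactly what the failing call in the question expects:

```xml
<!-- etc/hadoop/core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- etc/hadoop/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```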



Make sure HDFS is online. Start it with $HADOOP_HOME/sbin/start-dfs.sh.
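A quick way to confirm HDFS is actually listening after starting it is to probe the port from fs.defaultFS (9000 in the question). This is a sketch using bash's /dev/tcp pseudo-device:

```shell
# Probe localhost:9000; if the connect is refused, the NameNode is not up.
port_open=no
if (exec 3<>/dev/tcp/localhost/9000) 2>/dev/null; then
  port_open=yes
fi
echo "NameNode port 9000 open: $port_open"
```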

