
I am trying to connect my Java client to my Hadoop HDFS, but I am stuck when I try to get the FileSystem from my configuration:

Configuration conf = new Configuration();
conf.set("fs.default.name", _PATH_);
conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
FileSystem f = FileSystem.get( conf );

Then I get this exception:

java.lang.RuntimeException: class org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback not org.apache.hadoop.security.GroupMappingServiceProvider

I googled it but nothing helpful came up. Any advice?

PS: I use the hadoop-common and hadoop-hdfs packages from 2.0.0-cdh4.2.0.

Thanks, Anthony.

1 Answer


Try the code below:

Path coreSitePath = new Path("/path/of/HADOOP_HOME", "conf/core-site.xml");
conf.addResource(coreSitePath);
FileSystem fs = FileSystem.get(conf);
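For reference, a minimal core-site.xml the client could load this way might look like the sketch below. The hostname and port are placeholders (8020 is a common NameNode default), not values from the original question:

```xml
<?xml version="1.0"?>
<!-- Hypothetical minimal core-site.xml; replace namenode-host:8020
     with the address of your actual NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>
```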

Comments

Thank you for your answer, but my HADOOP_HOME is on a server, and I am trying to run my client on my computer.
I tried your solution (I copied core-site.xml from the server to my computer) and I still get the same JniBasedUnixGroupsMappingWithFallback exception.
I tried it the same way you have done, and it worked for me: Configuration conf = new Configuration(); conf.set("fs.default.name", "hdfs://hostname:9000/"); FileSystem fs = FileSystem.get(conf);
Just check: there might be an issue with the versions of the jars you are using.
Hello, may I ask you two questions? Which version of Hadoop HDFS do you use, and where did you download the HDFS client Java jar package? I cannot find a valid Hadoop HDFS client jar package for Hadoop HDFS 2.7.1.
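As the comment about jar versions suggests, this particular RuntimeException ("class X not GroupMappingServiceProvider") is typically a symptom of mixed Hadoop jar versions on the client classpath, e.g. hadoop-common from one release and hadoop-hdfs from another. A sketch of a Maven dependency section pinning both artifacts to the same CDH release mentioned in the question (this assumes the Cloudera Maven repository is configured in your build; the exact fix depends on your setup):

```xml
<!-- Hypothetical Maven fragment: both Hadoop artifacts pinned
     to the same CDH release to avoid classpath version skew -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.0-cdh4.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.0.0-cdh4.2.0</version>
  </dependency>
</dependencies>
```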
