
I'm writing my first Hadoop application and I'm getting an error. I don't quite understand what some of the details in this stack trace mean. It's a ClassNotFoundException. I'm building on Ubuntu Linux 12.10, Eclipse 3.8.0, and Java 1.6.0_24. I installed Hadoop by downloading it from the Apache site and building it with Ant.

The crash happens on the first line of the program, where I create the job:

public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {

    Job job = new Job(); // <== crashing here


Program [Java Application]  
com.sandbox.hadoop.Program at localhost:33878   
    Thread [main] (Suspended (exception ClassNotFoundException))    
        owns: Launcher$AppClassLoader  (id=29)  
        owns: Class<T> (org.apache.hadoop.security.UserGroupInformation) (id=25)    
        URLClassLoader$1.run() line: 217    
        AccessController.doPrivileged(PrivilegedExceptionAction<T>, AccessControlContext) line: not available [native method]   
        Launcher$AppClassLoader(URLClassLoader).findClass(String) line: 205 
        Launcher$AppClassLoader(ClassLoader).loadClass(String, boolean) line: 321   
        Launcher$AppClassLoader.loadClass(String, boolean) line: 294    
        Launcher$AppClassLoader(ClassLoader).loadClass(String) line: 266    
        DefaultMetricsSystem.<init>() line: 37  
        DefaultMetricsSystem.<clinit>() line: 34    
        UgiInstrumentation.create(Configuration) line: 51   
        UserGroupInformation.initialize(Configuration) line: 216    
        UserGroupInformation.ensureInitialized() line: 184  
        UserGroupInformation.isSecurityEnabled() line: 236  
        KerberosName.<clinit>() line: 79    
        UserGroupInformation.initialize(Configuration) line: 209    
        UserGroupInformation.ensureInitialized() line: 184  
        UserGroupInformation.isSecurityEnabled() line: 236  
        UserGroupInformation.getLoginUser() line: 477   
        UserGroupInformation.getCurrentUser() line: 463 
        Job(JobContext).<init>(Configuration, JobID) line: 80   
        Job.<init>(Configuration) line: 50  
        Job.<init>() line: 46   
        Program.main(String[]) line: 17 
/usr/lib/jvm/java-6-openjdk-amd64/bin/java (Jan 14, 2013 2:42:36 PM)    

Console Output:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:216)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:477)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:80)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:50)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:46)
at com.sandbox.hadoop.Program.main(Program.java:18)

Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 16 more
  • Most likely hadoop-core.jar could not be found. Are you running the job from Eclipse? Commented Jan 14, 2013 at 21:07
  • Yes, I'm hitting the debug button so I can step through it. I added the jar through Package Explorer -> Referenced Libraries -> right-click -> Build Path -> Configure Build Path -> Java Build Path -> Libraries. I clicked "Add External JARs", found hadoop-core-1.1.1.jar, and added it. Is this enough? Commented Jan 14, 2013 at 21:13
  • Yes, that should be enough. But looking again at your stack trace, I think the key reason is the NoClassDefFoundError for org.apache.commons.configuration.Configuration. That jar is bundled with Hadoop (together with several other jars). As Charles suggested, try adding those jars to your build configuration (in my case the jars are located at /usr/lib/share/hadoop). Commented Jan 14, 2013 at 21:52
  • You are missing the dependencies. I have posted the dependencies at stackoverflow.com/questions/22366391/…; you can copy them from there. Also consider using Maven. Commented Sep 12, 2015 at 10:24

3 Answers


You should add all the jars found in /usr/lib/hadoop-0.xx/lib to avoid this kind of classpath issue.

To give you an idea, you can run hadoop classpath, which prints the classpath needed to get the Hadoop jar and the required libraries.

In your case you're missing hadoop-common-0.xx.jar, so add it to the classpath and you should be good to go.
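As a sketch of what "add all the jars in lib" amounts to, here's how those jars resolve into a -cp argument. The class name BuildClasspath and the lib path below are illustrative, not from the question; point the path at your actual install.

```java
import java.io.File;
import java.io.FilenameFilter;

public class BuildClasspath {
    // Joins "." plus every *.jar under libDir into a single -cp argument,
    // using the platform path separator (':' on Linux).
    static String classpathFrom(File libDir) {
        StringBuilder cp = new StringBuilder(".");
        File[] jars = libDir.listFiles(new FilenameFilter() {
            public boolean accept(File dir, String name) {
                return name.endsWith(".jar");
            }
        });
        if (jars != null) {
            for (File jar : jars) {
                cp.append(File.pathSeparatorChar).append(jar.getAbsolutePath());
            }
        }
        return cp.toString();
    }

    public static void main(String[] args) {
        // Illustrative path; substitute your Hadoop lib directory.
        System.out.println(classpathFrom(new File("/usr/lib/hadoop-0.20/lib")));
    }
}
```

You would then launch with something like java -cp "$(that string)" com.sandbox.hadoop.Program, or simply paste the output of hadoop classpath into Eclipse's build path.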


1 Comment

Yeah, I just needed a couple more JARs: commons-lang and commons-configuration were both missing. The package names were throwing me off a little since I come from a .NET environment.

Does your main program need org.apache.commons.configuration.Configuration or should this be org.apache.hadoop.conf.Configuration?

Looks like Eclipse has auto-imported the wrong Configuration class, which isn't on the classpath when Hadoop runs on your cluster.

Can you share your source code, in particular the com.sandbox.hadoop.Program class, main method?
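A quick way to see which of the two classes is actually reachable is to probe both names with Class.forName. This is a sketch; ClasspathProbe is a made-up name, and on a machine without the Hadoop jars both lookups will report "missing", which is exactly the symptom in the trace.

```java
public class ClasspathProbe {
    // Attempts to load a class by fully qualified name and reports the result.
    static void probe(String name) {
        try {
            Class.forName(name);
            System.out.println(name + ": present");
        } catch (ClassNotFoundException e) {
            System.out.println(name + ": missing");
        }
    }

    public static void main(String[] args) {
        probe("org.apache.hadoop.conf.Configuration");            // the one a Hadoop job wants
        probe("org.apache.commons.configuration.Configuration");  // Commons Configuration
    }
}
```

Run this with the same classpath as your job; whichever line prints "missing" tells you which jar to add.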

1 Comment

Hi, I know this is a very old post. Hope you can help me! I have imported only org.apache.hadoop.conf.Configuration and want to use only that, but I am still getting the error: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration. Why is this happening? How can I resolve this?

I was facing the same problem. I solved it by adding commons-configuration-x.x.jar to my build path. It is under $HADOOP_HOME/lib.

