
I am using an Ubuntu system and trying to run the wordcount.jar program. Unfortunately, I am getting the following error:

Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class WordCountExample$Map not found

I have updated the classpath as follows:

job.setJarByClass(WordCountExample.class); and jobconf.setJarByClass(WordCountExample.class);

None of this works. I don't know what is wrong. Please share your expertise in this regard.

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

/** @author http://www.devinline.com */

public class WordCountExample {
    /* Map class whose map method the job will execute */
    public static class Map extends
            Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    /* Reduce class whose reduce method the job will execute */
    public static class Reduce extends
            Reducer<Text, IntWritable, Text, IntWritable> {

        public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        /* Create a job with the name wordCountExample */
        Job job = Job.getInstance(conf, "wordCountExample");
        job.setJarByClass(WordCountExample.class);

        /*
         * Handle String and int the Hadoop way: for String, Hadoop uses the
         * Text class, and for int it uses IntWritable
         */
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        /*
         * Configure the mapper and reducer classes whose map and reduce
         * methods the job will run
         */
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);

        /* Input and output formats set to TextInputFormat/TextOutputFormat */
        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        /* addInputPath - passes the input file path to the job */
        FileInputFormat.addInputPath(job, new Path(args[0]));
        /* setOutputPath - passes the output path to the job */
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        /* Submit the job to the cluster and wait for it to finish;
         * exit with 0 on success and 1 on failure. */
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

hadoop jar /home/nahmed/WordcountSample.jar WordCountExample /user/nahmed/pg20417.txt /user/nahmed/WCoutput

19/08/15 21:31:12 INFO client.RMProxy: Connecting to ResourceManager at it066431.massey.ac.nz/130.123.248.83:8050

19/08/15 21:31:12 INFO client.AHSProxy: Connecting to Application History server at it066431.massey.ac.nz/130.123.248.83:10200
19/08/15 21:31:12 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
19/08/15 21:31:12 WARN mapreduce.JobResourceUploader: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
19/08/15 21:31:12 INFO input.FileInputFormat: Total input paths to process : 1
19/08/15 21:31:12 INFO mapreduce.JobSubmitter: number of splits:1
19/08/15 21:31:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1562128011754_0026
19/08/15 21:31:13 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
19/08/15 21:31:13 INFO impl.YarnClientImpl: Submitted application application_1562128011754_0026
19/08/15 21:31:13 INFO mapreduce.Job: The url to track the job: http://it066431.massey.ac.nz:8088/proxy/application_1562128011754_0026/
19/08/15 21:31:13 INFO mapreduce.Job: Running job: job_1562128011754_0026
19/08/15 21:31:17 INFO mapreduce.Job: Job job_1562128011754_0026 running in uber mode : false
19/08/15 21:31:17 INFO mapreduce.Job: map 0% reduce 0%
19/08/15 21:31:20 INFO mapreduce.Job: Task Id : attempt_1562128011754_0026_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2228)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2134)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2226)
    ... 8 more

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

19/08/15 21:31:23 INFO mapreduce.Job: Task Id : attempt_1562128011754_0026_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2228)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2134)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2226)
    ... 8 more

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

19/08/15 21:31:26 INFO mapreduce.Job: Task Id : attempt_1562128011754_0026_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2228)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: Class WordCountExample$Map not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2134)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2226)
    ... 8 more

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

19/08/15 21:31:31 INFO mapreduce.Job: map 100% reduce 100%
19/08/15 21:31:31 INFO mapreduce.Job: Job job_1562128011754_0026 failed with state FAILED due to: Task failed task_1562128011754_0026_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

19/08/15 21:31:31 INFO mapreduce.Job: Counters: 13
    Job Counters
        Failed map tasks=4
        Killed reduce tasks=1
        Launched map tasks=4
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=53207
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=7601
        Total time spent by all reduce tasks (ms)=0
        Total vcore-milliseconds taken by all map tasks=7601
        Total vcore-milliseconds taken by all reduce tasks=0
        Total megabyte-milliseconds taken by all map tasks=54483968
        Total megabyte-milliseconds taken by all reduce tasks=0

1 Answer


You have two nested classes, Map and Reduce. The error says that Java was not able to find them. Most probably the jar /home/nahmed/WordcountSample.jar doesn't contain them. Verify that by executing:

 jar -tf /home/nahmed/WordcountSample.jar
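
In a correctly packaged jar you would expect to see at least WordCountExample.class, WordCountExample$Map.class and WordCountExample$Reduce.class listed at the root of the archive, next to META-INF/MANIFEST.MF.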

If they were not included in the jar, I think the issue is the way you compile and package it.
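
For reference, rebuilding the jar from scratch could look roughly like this (just a sketch, assuming WordCountExample.java is in your current directory and the hadoop command is on your PATH; the classes directory name is only an example):

 # compile against the Hadoop client libraries; `hadoop classpath` prints them for your install
 mkdir -p classes
 javac -classpath "$(hadoop classpath)" -d classes WordCountExample.java

 # package only the compiled classes, so WordCountExample$Map.class and
 # WordCountExample$Reduce.class end up at the root of the jar
 jar -cvf WordcountSample.jar -C classes .

 # run it the same way as before
 hadoop jar WordcountSample.jar WordCountExample /user/nahmed/pg20417.txt /user/nahmed/WCoutput

Packaging from a dedicated classes directory also keeps source files and old job output out of the jar.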


2 Comments

Please correct me if I am doing anything wrong. That command shows me the Java classes: META-INF/MANIFEST.MF, WordCountExample$Map.class, WordCountExample$Reduce.class, WordCountExample.class, WordCountExample.java, .classpath, .project, input.txt, output/._SUCCESS.crc, output/.part-r-00000.crc, output/_SUCCESS, output/part-r-00000. I am using the following command to run the job: hadoop jar WordcountSample.jar WordCountExample /user/nahmed/pg20417.txt /user/nahmed/WCoutput
The listing is not very clear. It seems like you are adding the output of the WordCount execution to the jar. How do you compile?
