
The application throws a java.lang.NoSuchMethodException

Stack trace:

DAGScheduler: Failed to run runJob at ReceiverTracker.scala:275
Exception in thread "Thread-33" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 4 times, most recent failure: Lost task 0.3 in stage 6.0 (TID 77, 172.20.7.60): java.lang.NoSuchMethodException: org.apache.spark.examples.streaming.KafkaKeyDecoder.<init>(kafka.utils.VerifiableProperties)
        java.lang.Class.getConstructor0(Class.java:2810)
        java.lang.Class.getConstructor(Class.java:1718)
        org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:106)
        org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
        org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
        org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
        org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)

It seems this issue has already been fixed in Spark 1.1.0, as per this link.

Spark: 1.1.0
Kafka: 0.8.1.1
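
For context, the call that fails is the reflective constructor lookup visible in the stack trace: the Kafka receiver resolves the configured decoder class by name and expects it to expose a constructor taking kafka.utils.VerifiableProperties. A rough sketch of that mechanism (paraphrased from the stack trace, not copied from the Spark source):

    import kafka.utils.VerifiableProperties

    object DecoderLookupSketch {
      def main(args: Array[String]): Unit = {
        // The decoder class name comes from the job configuration; this one is copied
        // from the exception message in the stack trace above.
        val decoderClass = Class.forName("org.apache.spark.examples.streaming.KafkaKeyDecoder")

        // This is the lookup that throws NoSuchMethodException when the class visible on
        // the executor classpath has no (VerifiableProperties) constructor:
        val ctor = decoderClass.getConstructor(classOf[VerifiableProperties])
        val decoder = ctor.newInstance(new VerifiableProperties())
        println(decoder.getClass.getName)
      }
    }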

  • Could you post the code as well? Commented Nov 7, 2014 at 10:48
  • I'm having the same problem as you. The first thing is to make sure that you have a constructor for KafkaKeyDecoder that takes one parameter of type kafka.utils.VerifiableProperties. In my case I have this constructor, and whether I use StringDecoder or DefaultDecoder I still get the same exception. I'm running DSE 4.6.2 with Spark 1.1.0 and Kafka 0.8.0. I guess that there is something in the classpath that creates the issue, but I have not yet found what ... Commented Apr 3, 2015 at 14:34
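
Building on the second comment above, a minimal decoder that satisfies this reflective lookup could look like the following. This is an illustrative sketch (the real KafkaKeyDecoder from the question is not shown on this page), modeled on Kafka's own StringDecoder:

    import kafka.serializer.Decoder
    import kafka.utils.VerifiableProperties

    // The default argument gives the class a public (VerifiableProperties) constructor,
    // which is exactly the signature the receiver resolves via reflection.
    class KafkaKeyDecoder(props: VerifiableProperties = null) extends Decoder[String] {
      // Treat the message key as UTF-8 text; adjust to whatever the keys really contain.
      override def fromBytes(bytes: Array[Byte]): String = new String(bytes, "UTF-8")
    }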

2 Answers


In my case, as explained in the comment, removing the library conflicts let me consume data from Kafka and store it in Cassandra correctly, deploying the job on the Datastax Analytics Solution. What I found different from the open-source distribution is that the streaming_kafka jar and all of the Scala libraries are already included on the executor classpath.

So I suggest the following:

  1. Make sure that you are using the same Scala compiler version as Spark.
  2. Make sure that the kafka and streaming_kafka jars are compiled against the same versions.
  3. Check whether the Scala libraries are already included on the executor classpath, and if so, do not include them in your package.

I have assumed that you are building an uber jar that you are trying to deploy; a minimal build sketch of these points follows below.
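
As a concrete illustration of the three points above, here is a minimal build.sbt sketch. The version numbers match the question (Spark 1.1.0, Kafka 0.8.1.1); whether each dependency should be marked "provided" depends on what your distribution already ships on the executor classpath, so treat this as an assumption rather than a recipe:

    // Match the Scala version Spark 1.1.0 was built against (point 1).
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      // Already on the executor classpath in this distribution, so do not bundle them
      // into the uber jar (point 3).
      "org.apache.spark" %% "spark-core"            % "1.1.0" % "provided",
      "org.apache.spark" %% "spark-streaming"       % "1.1.0" % "provided",
      // Keep the Kafka integration and the Kafka client compiled for matching
      // Scala/Kafka versions (point 2).
      "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0",
      "org.apache.kafka" %% "kafka"                 % "0.8.1.1"
    )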




You are missing the Kafka jar that contains the method.
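
In sbt terms, that usually means declaring the Kafka client (and the matching Spark Kafka integration) explicitly so the classes actually end up in your assembly, or otherwise shipping the jar alongside the job at submit time. The versions below are taken from the question and are otherwise an assumption:

    libraryDependencies ++= Seq(
      // The Kafka client jar providing kafka.utils.VerifiableProperties and the
      // decoder base classes.
      "org.apache.kafka" %% "kafka"                  % "0.8.1.1",
      "org.apache.spark" %% "spark-streaming-kafka"  % "1.1.0"
    )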

