
I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar with the Spark dependencies marked as provided, so they are excluded from the jar (the jar is at analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar).
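For context, the build is roughly along these lines (a trimmed build.sbt sketch, assuming the sbt-assembly plugin; the dependency versions are illustrative, only the name and version come from the jar path above):

    name := "dtex-analysis"
    version := "0.1"
    scalaVersion := "2.11.4"

    libraryDependencies ++= Seq(
      // Spark marked "provided" so sbt-assembly leaves it out of the fat jar
      "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
      "org.rogach" %% "scallop" % "0.9.5"
    )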

I am able to run the program fine in sbt.

I tried to run it from the command line as follows:

time ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar java -cp /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar com.dtex.analysis.transform.GenUserSummaryView -d /Users/arun/DataSets/LME -p output -s txt -o /Users/arun/tmp/LME/LME

I get the following error message:

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at org.rogach.scallop.package$.<init>(package.scala:37)
    at org.rogach.scallop.package$.<clinit>(package.scala)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.delayedEndpoint$com$dtex$analysis$transform$GenUserSummaryView$Conf$1(GenUserSummaryView.scala:27)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf$delayedInit$body.apply(GenUserSummaryView.scala:26)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at org.rogach.scallop.AfterInit$class.delayedInit(AfterInit.scala:12)
    at org.rogach.scallop.ScallopConf.delayedInit(ScallopConf.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.<init>(GenUserSummaryView.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:54)
    at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)

  • Looks like your classpath lacks scala-reflect.jar. (Commented Jan 13, 2015 at 8:58)

2 Answers


The issue is that you have mixed incompatible Scala versions: this Spark distribution was compiled with Scala 2.10, while your application was built with Scala 2.11.

Move everything to Scala 2.10 and make sure you update your sbt build as well.
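A minimal sketch of the relevant build.sbt lines (dependency versions are illustrative; the %% operator makes sbt resolve the _2.10 artifacts once scalaVersion is 2.10):

    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
      "org.rogach" %% "scallop" % "0.9.5"
    )

The assembled jar then lands under analysis/target/scala-2.10/ with a _2.10 suffix, in place of the _2.11 jar you built before.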

You may also try to compile the Spark sources for Scala 2.11.7 and use that build instead.
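For Spark 1.x this meant switching the build over with a helper script before compiling; roughly (script name and Maven profiles vary by Spark version, so treat this as a sketch based on the Spark build docs):

    dev/change-version-to-2.11.sh
    mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package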




I also ran into the same issue with spark-submit; in my case:

The Spark job was compiled with: Scala 2.10.8

The Scala version Spark was compiled with on the cluster: Scala 2.11.8

To check the Spark and Scala versions on the cluster, use the spark-shell command.
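For example, from inside spark-shell (sc.version is the SparkContext version API, and scala.util.Properties.versionString reports the Scala version the shell is running; the exact output depends on your cluster):

    scala> sc.version
    scala> scala.util.Properties.versionString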

After recompiling the Spark job source with Scala 2.11.8 and resubmitting, the job worked.

