
I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10.4 (SCALA_VERSION=2.10.4 is specified in its .sh file).

There is no problem starting the server or uploading the context/app jars, but at runtime the following error occurs:

java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;

The question "Why do Scala 2.11 and Spark with scallop lead to "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror"?" talks about using Scala 2.10 to compile the sources. Is that true?

Any suggestions, please?
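
For what it's worth, a quick way to confirm this kind of mismatch is to log the Scala version each side actually runs. Below is a minimal diagnostic sketch (the object name is made up for illustration); run it once inside the server's JVM and once from your build environment:

    import scala.util.Properties

    // Hypothetical diagnostic: print the Scala version the current
    // classloader is running on. If the job's build environment and the
    // server's JVM differ at the binary level (2.10 vs 2.11), errors
    // like NoSuchMethodError on scala.reflect APIs are expected.
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        println(Properties.versionString)        // e.g. "version 2.10.4"
        println(Properties.versionNumberString)  // e.g. "2.10.4"
      }
    }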

1 Comment

  • "Is it true?" Yep. Commented Oct 3, 2016 at 12:09

1 Answer


Use Scala 2.10.4 to compile your project. Otherwise you need to compile Spark with Scala 2.11 too. Scala is not binary compatible across minor versions (2.10 vs 2.11), so every artifact on the classpath, including Spark itself, must be built against the same Scala line.
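
If the project were built with sbt, the fix amounts to pinning scalaVersion and letting %% select the matching Spark artifacts. A minimal build.sbt sketch, assuming the server supplies Spark at runtime (hence the "provided" scope):

    // build.sbt -- sketch for a job targeting the Scala 2.10-based
    // Spark Job Server described in the question.
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary suffix, resolving spark-core_2.10;
      // "provided" keeps Spark classes out of the uber jar, since the
      // server already ships its own copy of Spark.
      "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"
    )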


4 Comments

That indeed works, thanks. Can you please explain why? Most folks seem to agree on using 2.11.*.
You have to compile Spark with Scala 2.11 as well.
OK. I use a Maven project to download the Spark dependencies and Eclipse to compile my project along with the Maven dependencies; at this point, the SJS jar is also added to the Maven repo. Now, should changing the Scala compiler version from 2.10 to 2.11 suffice? That's precisely what I was doing. Am I missing something?
No, I meant the Spark binary. In your deployment environment there will be a Spark binary, and that should be compiled with Scala 2.11.
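
To make the situation in these comments concrete: if you have to target both a 2.10-based server and a 2.11-based Spark binary over time, sbt cross-building lets one source tree publish a jar per Scala version. A hedged sketch (versions copied from the question, not a definitive setup):

    // build.sbt -- cross-building sketch: `sbt +package` produces
    // target/scala-2.10/<name>_2.10-*.jar and
    // target/scala-2.11/<name>_2.11-*.jar, so a jar is always
    // available to match the Scala version of the deployed Spark.
    scalaVersion       := "2.10.4"
    crossScalaVersions := Seq("2.10.4", "2.11.8")

    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"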
