
I am trying to run a sample Scala Spark program in IntelliJ. I have created a Maven project and added the Scala nature to it. I can run the Scala hello-world program, but when I run the Spark Scala program it throws the exception below.

Exception in thread "main" java.lang.VerifyError: class scala.collection.mutable.WrappedArray overrides final method toBuffer.()Lscala/collection/mutable/Buffer;
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at com.dnb.dsl.test.SparkDemo$.main(SparkDemo.scala:7)

Here is the program:

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkDemo {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("SparkDemo").setMaster("local")
        val sc = new SparkContext(conf)
        val input = sc.parallelize(Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))
        input.foreach(println)
      }
    }

pom.xml

    <properties>
        <spark.version>2.0.1</spark.version>
        <scala.version>2.11</scala.version>
    </properties>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>
    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <configuration>
                        <encoding>UTF-8</encoding>
                        <source>1.8</source>
                        <target>1.8</target>
                        <compilerArgument>-Werror</compilerArgument>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build> 

Attaching the Scala SDK version configuration from IntelliJ. Java version used: 1.8.

(screenshot: Scala version from IntelliJ)

  • You will need to downgrade your Scala version from 2.13 to 2.11, or better yet, use Spark 2.4.5 and Scala 2.12 (see the pom sketch after these comments). Commented Apr 18, 2020 at 2:06
  • @mazaneicha That throws another exception: Exception in thread "main" java.lang.NoSuchMethodError: scala.util.matching.Regex.<init>(Ljava/lang/String;Lscala/collection/Seq;)V Commented Apr 18, 2020 at 14:12
  • See stackoverflow.com/questions/43883325/… and spark.apache.org/docs/2.0.0/#downloading: Spark 2.0.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x). Commented Apr 18, 2020 at 15:05
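
As the comments point out, the VerifyError comes from mixing a Scala 2.13 SDK with Spark artifacts compiled for Scala 2.11. If you want to stay on Maven, a minimal sketch of the pom properties and dependencies aligned on Spark 2.4.5 and Scala 2.12 could look like the following (the exact patch versions are assumptions; any matching Spark/Scala pair works):

    <properties>
        <spark.version>2.4.5</spark.version>
        <scala.binary.version>2.12</scala.binary.version>
        <scala.version>2.12.10</scala.version>
    </properties>
    <dependencies>
        <!-- Spark core built for the same Scala binary version as the project SDK -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- Keep the Scala library itself on the matching version -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
    </dependencies>

The Scala SDK configured in IntelliJ (currently 2.13) would also need to match the _2.12 suffix, otherwise the same VerifyError appears at runtime.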

1 Answer


I would try a different approach: as @mazaneicha has suggested, downgrade your Scala version. Instead of Maven, I would use SBT for Scala. IntelliJ comes fully integrated with SBT and Scala, and SBT is really easy to work with.

A build.sbt for your example would be:

name := "my-project"

version := "0.1"

scalaVersion := "2.11.12"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

To find and download libraries for your Spark projects you can use the Maven repository:

Spark Maven repository
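
For completeness, with that build.sbt in the project root, the standard SBT layout and run command would be (the directory names below are the SBT defaults; SparkDemo.scala is the program from the question):

    my-project/
      build.sbt
      src/main/scala/SparkDemo.scala

    # run from the project root
    sbt run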
