I'm trying to run a Scala application. It works well in spark-shell, but when I use spark-submit with my class, it fails.

spark-submit --deploy-mode cluster --master yarn --class org.apache.spark.examples.SparkPi s3n://bucket/test.scala

Application:

package org.apache.spark.examples

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    // A SparkContext is needed before a SQLContext can be created
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    print("test")
  }
}

Error:

Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi)

1 Answer

Build a jar from your test.scala source and provide that as the argument to spark-submit. spark-submit expects a jar containing your compiled code, not the source file itself.
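For example, the submit command would then point at an uploaded jar instead of the .scala file (the jar name and the S3 path here are assumptions, use whatever your build actually produces):

spark-submit --deploy-mode cluster --master yarn --class org.apache.spark.examples.SparkPi s3n://bucket/spark-pi_2.11-0.1.jar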


2 Comments

I compiled with SBT, no?
Yep, and you got some .class files. Now pack them into a jar and use that to launch your job. According to the command you provided, you're pointing spark-submit at the source code (.scala) instead of the compiled code.
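As a rough sketch of that packaging step, an SBT build definition along these lines produces a jar under target/ (the project name and the Spark/Scala versions are assumptions, match them to your cluster):

// build.sbt -- minimal sketch; Spark is marked "provided" because the cluster supplies it at runtime
name := "spark-pi"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.2" % "provided"
)

Running sbt package then emits something like target/scala-2.11/spark-pi_2.11-0.1.jar, which is the file spark-submit expects.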
