
I'm trying to run the first example from the documentation that says:

Spark runs on Java 6+ and Python 2.6+. For the Scala API, Spark 1.0.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).

This happens when I run the command:

./bin/run-example examples/src/main/scala/org/apache/spark/examples/SparkPi.scala 10


Exception in thread "main" java.lang.ClassNotFoundException: org.apache.spark.examples.examples/src/main/scala/org/apache/spark/examples/SparkPi.scala
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:289)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I'm using the pre-built version for Hadoop 2 (HDP2, CDH5).

I've already tried Scala 2.9.3, 2.10.3, and 2.11.

Any ideas?

  • Look again at the documentation; your command does not match it. The argument is exactly "SparkPi". (Commented Jun 29, 2014 at 20:11)

1 Answer


You need to specify the name of the example class, not the path to a source file; try

./bin/run-example SparkPi 10

This is described in the "Running the Examples and Shell" section of the Spark documentation.
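
For what it's worth, run-example is essentially a wrapper around spark-submit (the stack trace above shows SparkSubmit being invoked), so a roughly equivalent direct invocation would look something like the following; the examples JAR name and location are an assumption here and depend on your download, so adjust the path to whatever ships with your distribution:

# Rough spark-submit equivalent of run-example; the JAR path below is hypothetical.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[4] \
  lib/spark-examples-*.jar \
  10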


2 Comments

Thanks Josh, but now I get this: java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi.scala. Any clue?
What's the exact command that you're running? Did you try ./bin/run-example SparkPi.scala 10 with the .scala extension? Try leaving that off, since that's not part of the class name.
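
In other words (just a sketch of how the argument appears to be resolved, judging from the error messages above), run-example prefixes whatever you pass with the org.apache.spark.examples package, so:

./bin/run-example SparkPi 10          # looks up org.apache.spark.examples.SparkPi (works)
./bin/run-example SparkPi.scala 10    # looks up org.apache.spark.examples.SparkPi.scala (ClassNotFoundException)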
