I'm trying to run the first example from the Spark documentation, which says:
Spark runs on Java 6+ and Python 2.6+. For the Scala API, Spark 1.0.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
I get the following exception when I run the command:
./bin/run-example examples/src/main/scala/org/apache/spark/examples/SparkPi.scala 10
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.spark.examples.examples/src/main/scala/org/apache/spark/examples/SparkPi.scala
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:289)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I'm using the prebuilt version for Hadoop 2 (HDP2, CDH5).
I already tried Scala 2.9.3, 2.10.3, and 2.11.
Any ideas?
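Edit: looking at the exception more closely, the class it fails to find is `org.apache.spark.examples.` followed by the path I passed, so it seems `run-example` prepends the examples package to its first argument and expects a bare class name, not a source file path. If I'm reading that right (I haven't confirmed it against the script itself), is the intended invocation simply:

```shell
# Pass the example class name; run-example would then resolve it to
# org.apache.spark.examples.SparkPi inside the examples assembly jar.
./bin/run-example SparkPi 10
```

(This requires a local Spark installation, so I can't verify it outside my environment.)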