
I am following this example exactly: https://github.com/rathboma/hadoop-framework-examples/tree/master/spark. When I try to run it, I get this message:

java.lang.ClassCastException: org.apache.spark.api.java.Optional cannot be cast to com.google.common.base.Optional

I do not know how to fix it, because I am a newbie with Spark. Any suggestions? Thanks!

  • Looks like you imported the wrong Optional class. (Commented Jun 1, 2017 at 22:11)

1 Answer


This is because you compiled your code with Spark 1.x but are running your application on a Spark 2.x cluster. You can update pom.xml to use the same version as your Spark cluster, and you will probably need to update your code as well, because 2.x and 1.x are not source-compatible.
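For illustration, here is a minimal sketch of the kind of code change involved: in the Spark 2.x Java API, leftOuterJoin returns org.apache.spark.api.java.Optional instead of Guava's com.google.common.base.Optional, so the import (and any casts) must change. The class name and data below are hypothetical, not taken from the linked example.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    // Spark 2.x: use Spark's own Optional, NOT com.google.common.base.Optional
    import org.apache.spark.api.java.Optional;

    import scala.Tuple2;

    public class LeftOuterJoinExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("left-outer-join-example");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Hypothetical data: (userId, amount) and (userId, location)
            JavaPairRDD<Integer, Integer> transactions = sc.parallelizePairs(
                    Arrays.asList(new Tuple2<>(1, 100), new Tuple2<>(2, 50)));
            JavaPairRDD<Integer, String> users = sc.parallelizePairs(
                    Arrays.asList(new Tuple2<>(1, "GB")));

            // In Spark 2.x the right side of the join is org.apache.spark.api.java.Optional
            JavaPairRDD<Integer, Tuple2<Integer, Optional<String>>> joined =
                    transactions.leftOuterJoin(users);

            joined.foreach(pair -> {
                Optional<String> location = pair._2()._2();
                String value = location.isPresent() ? location.get() : "UNKNOWN";
                System.out.println(pair._1() + " -> " + value);
            });

            sc.stop();
        }
    }

Alongside that, the spark-core version in pom.xml has to match the cluster version (for example 2.1.1, as discussed in the comments below).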


6 Comments

How do you know the cluster version?
I have installed Spark 2.1.1 using the file spark-2.1.1-bin-without-hadoop.tgz and then connected Spark to the Hadoop installation by setting the spark-env.sh file in /usr/local/spark/conf (I followed this: spark.apache.org/docs/latest/hadoop-provided.html). Can I change this part in pom.xml: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.10</artifactId> <version>2.1.1</version> </dependency>? Would that work?
@cricket_007 Print SparkContext.version (see the sketch after these comments).
@majitux Yeah, you need to change it. However, you will probably also need to update your code because Spark 2.x breaks source compatibility. That's why you see the ClassCastException when mixing different versions.
@cricket_007 Just because I happened to review the change from com.google.common.base.Optional to org.apache.spark.api.java.Optional :)
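As a follow-up to the version check suggested above, here is a minimal sketch of printing the Spark version the application actually runs against; the class and application name are made up, and in spark-shell the expression sc.version does the same thing.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class PrintSparkVersion {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("print-spark-version");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // version() reports the Spark version this application is running on
            System.out.println("Cluster Spark version: " + sc.version());
            sc.stop();
        }
    }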
