I am following this example exactly: https://github.com/rathboma/hadoop-framework-examples/tree/master/spark When I try to run it, I get this message:
java.lang.ClassCastException: org.apache.spark.api.java.Optional cannot be cast to com.google.common.base.Optional
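For context, here is a rough, simplified sketch of the kind of leftOuterJoin the example performs (this is not the exact code from the repository, and the sample data is made up by me): on Spark 2.x the joined values come back wrapped in org.apache.spark.api.java.Optional, while the example imports com.google.common.base.Optional, which is where I suspect the cast is failing.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
// Spark 2.x returns its own Optional from joins, not Guava's com.google.common.base.Optional.
import org.apache.spark.api.java.Optional;

import scala.Tuple2;

public class LeftOuterJoinSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("left-outer-join-sketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Made-up sample data standing in for the two datasets joined in the example.
        JavaPairRDD<Integer, String> left = sc.parallelizePairs(
                Arrays.asList(new Tuple2<>(1, "a"), new Tuple2<>(2, "b")));
        JavaPairRDD<Integer, String> right = sc.parallelizePairs(
                Arrays.asList(new Tuple2<>(1, "x")));

        // In Spark 2.x the value type here is Tuple2<String, org.apache.spark.api.java.Optional<String>>.
        JavaPairRDD<Integer, Tuple2<String, Optional<String>>> joined = left.leftOuterJoin(right);

        for (Tuple2<Integer, Tuple2<String, Optional<String>>> pair : joined.collect()) {
            Optional<String> maybeRight = pair._2()._2();
            // orElse supplies a default when there was no matching key on the right side.
            System.out.println(pair._1() + " -> " + pair._2()._1() + ", " + maybeRight.orElse("none"));
        }

        sc.stop();
    }
}
```

Since the exception mentions org.apache.spark.api.java.Optional, I think I am running a newer Spark version (2.x or later) than the one the example was written for, in case that matters.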
I do not know how to fix it, because I am a newbie with Spark. Any suggestions? Thanks!