I am attempting to connect to a standalone Spark server from a Java application using the following code:
SparkConf sparkConf_new = new SparkConf()
.setAppName("Example Spark App")
.setMaster("spark://my.server.com:7077");
JavaSparkContext sparkContext = new JavaSparkContext(sparkConf_new);
JavaRDD<String> stringJavaRDD = sparkContext.textFile("hdfs://cluster/my/path/test.csv");
out.println("Number of lines in file = " + stringJavaRDD.count());
I am receiving the following error:
An exception occurred at line 12
12: SparkConf sparkConf_new = new SparkConf()
13: .setAppName("Example Spark App")
14: .setMaster("spark://my.server.com:7077");
15: JavaSparkContext sparkContext = new JavaSparkContext(sparkConf_new);
16: JavaRDD<String> stringJavaRDD = sparkContext.textFile("hdfs://cluster/my/path/test.csv");
17: out.println("Number of lines in file = " + stringJavaRDD.count());
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.util.Utils$
at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:53)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:123)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:54)
The following JARs are included on the classpath:
scala-library-2.10.5.jar
spark-core_2.10-1.6.0.jar
hadoop-core-1.2.1.jar
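For reference, these JARs correspond to the following Maven coordinates (a sketch, assuming the standard artifact IDs on Maven Central):

```xml
<!-- Sketch of the equivalent Maven dependencies; versions match the JARs listed above -->
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
  </dependency>
</dependencies>
```

Note that spark-core_2.10 normally pulls in a matching scala-library and Hadoop client transitively, so the explicit scala-library and hadoop-core entries here mirror the manually assembled classpath rather than a recommended dependency setup.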
The underlying exception is an ExceptionInInitializerError for the Utils$ class.