I'm trying to create a simple Spark program in Java, and I get this error:
Error:(10, 57) java: incompatible types: org.apache.spark.SparkConf cannot be converted to org.apache.spark.SparkContext
My code:
package com.example.lab;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class Lab {
    public static void main(String[] args) {
        SparkConf config = new SparkConf();
        config.setMaster("local[*]").setAppName("Lab");
        JavaSparkContext jsc = new JavaSparkContext(config);
    }
}
I have a Windows 8.1 PC, running Java 1.8 and Spark 2.3.0.
Why do I get this error?
The code you posted is fine: a JavaSparkContext is constructed from a SparkConf, so there is nothing to convert. The compiler error corresponds to an assignment like SparkContext jsc = config; which does not appear in the code you show. Maybe you have some stale source caches in your IDE? Try a clean rebuild of the project so the compiler picks up the current sources.
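For reference, here is a minimal sketch of the constructions the compiler will accept (the class name LabFixed is just a placeholder). A SparkConf is always passed to the context's constructor, never assigned to a context variable directly:

package com.example.lab;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

public class LabFixed {
    public static void main(String[] args) {
        SparkConf config = new SparkConf()
                .setMaster("local[*]")
                .setAppName("Lab");

        // Java API: pass the conf to the constructor.
        JavaSparkContext jsc = new JavaSparkContext(config);

        // If you need the underlying Scala SparkContext, get it from the
        // JavaSparkContext; a plain assignment (SparkContext sc = config;)
        // is exactly what produces the "incompatible types" error.
        SparkContext sc = jsc.sc();

        jsc.close();
    }
}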