I'm new to Spark. I'm trying to run the following code in the Spark shell:
import org.apache.spark.api.java.JavaSparkContext
import org.apache.hadoop.conf
JavaSparkContext context = new JavaSparkContext(conf);
But I'm getting the following error:
<console>:32: error: value context is not a member of object org.apache.spark.api.java.JavaSparkContext
  val $ires10 = JavaSparkContext.context
                                 ^
<console>:29: error: value context is not a member of object org.apache.spark.api.java.JavaSparkContext
  JavaSparkContext context = new JavaSparkContext(conf);
                   ^
Is there any import statement that I'm missing? I also added
import org.apache.spark.api.java.JavaSparkContext._
but it still didn't work. Please help.
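In case it clarifies what I'm trying to do, this is my understanding of what the same thing would look like as a standalone Java program. The SparkConf setup, app name, and master URL below are placeholders I made up for this example; my real configuration may differ.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class Example {
    public static void main(String[] args) {
        // Placeholder configuration: the app name and master URL are made up for this example
        SparkConf conf = new SparkConf().setAppName("example").setMaster("local[*]");
        // This is the line I'm trying to run in the Spark shell
        JavaSparkContext context = new JavaSparkContext(conf);
        // ... RDD work would go here ...
        context.stop();
    }
}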