
I am trying to create a simple program using Spark in Java, and I get this error:

Error:(10, 57) java: incompatible types: org.apache.spark.SparkConf cannot be converted to org.apache.spark.SparkContext

My code:

package com.example.lab;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class Lab {
    public static void main(String[] args) {

            SparkConf config = new SparkConf();
            config.setMaster("local[*]").setAppName("Lab");
            JavaSparkContext jsc = new JavaSparkContext(config);
    }
}

I am on a Windows 8.1 PC, running Java 1.8 and Spark 2.3.0.

Why do I get this error?

  • The signature looks correct, but the error suggests you wrote something like SparkContext jsc = config;. Maybe you have some stale source caches in your IDE? Commented Mar 7, 2018 at 13:29

1 Answer


Don't use SparkContext directly. Since Spark 2.0, you should use SparkSession, which encapsulates both the SparkContext and the SQLContext. You can pass your configuration to a SparkSession like this:

SparkSession spark = SparkSession.builder()
  .config(config)
  .getOrCreate();

Or, even simpler, you can drop the SparkConf object entirely and specify the properties directly on the SparkSession.Builder:

SparkSession spark = SparkSession.builder()
  .master("local[*]")
  .appName("Lab")
  .getOrCreate();
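As a self-contained sketch of the builder approach (assuming the spark-sql artifact is on the classpath — SparkSession lives in spark-sql, not spark-core, which is a common reason for a "cannot find symbol: class SparkSession" error):

```java
package com.example.lab;

// SparkSession is provided by the spark-sql module
// (e.g. org.apache.spark:spark-sql_2.11 for Spark 2.3.0),
// so that dependency must be declared in your build.
import org.apache.spark.sql.SparkSession;

public class Lab {
    public static void main(String[] args) {
        // Build the session directly; no separate SparkConf is needed.
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("Lab")
                .getOrCreate();

        // ... your Spark code here ...

        spark.stop();
    }
}
```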

And of course, if you really need a SparkContext object, you can get it from the session:

spark.sparkContext();
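In the Java API that call returns the underlying (Scala) SparkContext; if you want the Java-friendly RDD API, a hedged sketch (assuming both spark-sql and spark-core are on the classpath) is to wrap it in a JavaSparkContext:

```java
import java.util.Arrays;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SessionToContext {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("Lab")
                .getOrCreate();

        // Wrap the session's SparkContext to get the Java RDD API.
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        long count = jsc.parallelize(Arrays.asList(1, 2, 3)).count();
        // count is 3

        spark.stop();
    }
}
```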

Check out the javadocs for the SparkSession class.

Hope it helps! :)


2 Comments

Hey Alvaro, thanks for your reply. I typed SparkSession but I get this error: Error:(25, 13) java: cannot find symbol symbol: class SparkSession location: class com.example.lab.Lab. It looks like it can't resolve SparkSession.
Hey, how have you imported Spark into your project? In any case, take this as a reference for how to create a Spark application using the Java API: it looks like in the Java API you need to create a JavaSparkContext object from the SparkSession, e.g. JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());. Please try it and tell me if it fixes the issue.
