
I have a class in Scala whose methods need a SparkContext. Therefore, in the class declaration I used an implicit SparkContext parameter, and the code compiled fine:

class Test(implicit sc: SparkContext) {
}

However, when I tried to instantiate the class, I got the following error:

val inst: Test = new Test()
error: could not find implicit value for parameter sc: org.apache.spark.SparkContext

Is this the wrong way to use a SparkContext within a class?

  • It simply means there is no implicit SparkContext in the given scope. Where do you define it? Commented May 25, 2016 at 23:42
  • When I type SparkContext or sc into the shell, I get org.apache.spark.SparkContext = org.apache.spark.SparkContext@ back. Therefore I assume there is one in the global frame. Is there a reason it can't see it? How do I make it see it? Commented May 25, 2016 at 23:47
  • The Spark context you get in the REPL is not implicit. Putting aside whether it makes sense or not, you can start with something like this: implicit val isc = sc. Commented May 26, 2016 at 0:06
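Pulling the comments together: the compiler only fills in an implicit constructor parameter if a value of that type is marked implicit in the caller's scope; the plain sc binding in the REPL doesn't qualify. Below is a minimal sketch of the wiring. The class name Test matches the question; the method parallelSum, the app name, and the local[*] master are illustrative choices, not anything from the original post.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// The class from the question, with an example method that uses the context.
class Test(implicit sc: SparkContext) {
  def parallelSum(xs: Seq[Int]): Double = sc.parallelize(xs).sum()
}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("implicit-sc-demo").setMaster("local[*]")

    // Marking the value `implicit` is what lets `new Test()` resolve its parameter.
    implicit val sc: SparkContext = new SparkContext(conf)

    val inst: Test = new Test() // sc is supplied implicitly
    println(inst.parallelSum(1 to 10)) // should print 55.0
    sc.stop()
  }
}
```

The same effect is available in the REPL with the one-liner from the comment above, implicit val isc = sc, which shadows nothing and simply re-exposes the existing context as an implicit value.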
