
I installed sbt-1.3.4.msi and when trying to build a sample SparkPi.scala app, I'm getting the following error:

C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error]                   ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error]     val conf = new SparkConf().setAppName("Spark Pi")
[error]                    ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error]     val spark = new SparkContext(conf)
[error]                     ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed

The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).

What am I missing here?

The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:

name := "Spark Sample"

version := "1.0"

scalaVersion := "2.12.10"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"

1 Answer


The C:\myapps\sbt\sparksample\project\src\main\scala directory contains the SparkPi.scala file

That's the problem. You've put the Scala file(s) under the project directory, which is owned by sbt itself (it holds the build definition), not under your sbt-managed Scala project's sources.

Move the SparkPi.scala and other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
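For reference, the standard sbt layout looks like this (the project directory is reserved for the build definition):

C:\myapps\sbt\sparksample\
  sparksample.sbt            (build definition)
  project\                   (owned by sbt: plugins.sbt, build helpers; no application sources)
  src\
    main\
      scala\
        SparkPi.scala        (your application sources)

And here is a minimal sketch of what SparkPi.scala might look like, consistent with the error messages above; the original file isn't shown in the question, so the body is illustrative only:

// Minimal SparkPi sketch for spark-core 3.0.0 / Scala 2.12 (illustrative, not the asker's exact file)
import org.apache.spark._

import scala.math.random

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)
    val n = 100000
    // Monte Carlo estimate: count random points that fall inside the unit circle
    val count = spark.parallelize(1 to n).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")
    spark.stop()
  }
}

With the file under src\main\scala, running sbt compile from C:\myapps\sbt\sparksample should succeed.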


3 Comments

After the above fix, it ran without errors, but there is no .jar file in C:\myapps\sbt\sparksample\target. Instead, I see a file called "out" in C:\myapps\sbt\sparksample\target\streams_global\ivyConfiguration_global\streams.
I fixed the issue by changing the sbt file's line to libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
@frosty Thanks for accepting my answer! Much appreciated. As for your "fix", I'd disagree: the += operator is for a single dependency, while ++= is for zero, one, or more dependencies. They're equivalent in your case. BTW, you don't need to look at the C:\myapps\sbt\sparksample\target\streams_global directory since it's internal to sbt (not for users).
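For illustration, both forms are valid in sparksample.sbt and are equivalent when declaring a single dependency:

// += appends one dependency
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"

// ++= appends a sequence of zero or more dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.0"
)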
