
Hi, please find below my code and the corresponding errors. Even though I have used the import statements, it still gives errors:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._

val sparkConf = new SparkConf().setAppName("new_proj")
implicit val sc = new SparkContext(sparkConf)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
import sqlContext.implicits._

val projects = sqlContext.read.json("/part-m-00000.json")

[error] /mapr/trans.scala:25: value implicits is not a member of org.apache.spark.sql.SQLContext
[error] import sqlContext.implicits._
[error]        ^
[error] /mapr/ppm_trans.scala:28: value read is not a member of org.apache.spark.sql.SQLContext
[error] val projects = sqlContext.read.json("/mapr//part-m-00000.json")

1 Answer 1


I was able to compile the code by changing the following lines in build.sbt:

libraryDependencies ++= Seq(
  "org.apache.spark"  % "spark-core_2.10"              % "1.4.0" % "provided",
  "org.apache.spark"  % "spark-sql_2.10"               % "1.4.0",
  "org.apache.spark"  % "spark-mllib_2.10"             % "1.4.0"
  )
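The underlying cause is that `SQLContext.read` and `sqlContext.implicits` were only introduced in Spark SQL 1.3+, so an older `spark-sql` artifact on the classpath produces exactly these "is not a member" errors. A minimal sketch of a consistent build.sbt, keeping every Spark module on the same version (the `scalaVersion` line and the use of `%%`, which appends the Scala binary suffix automatically, are assumptions, not from the original answer):

```scala
// build.sbt — minimal sketch; versions taken from the answer above,
// scalaVersion is an assumption matching the _2.10 artifact suffix
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // "provided" because the Spark runtime supplies spark-core on the cluster
  "org.apache.spark" %% "spark-core"  % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"   % "1.4.0",
  "org.apache.spark" %% "spark-mllib" % "1.4.0"
)
```

Mixing versions across Spark modules (e.g. spark-core 1.4.0 with spark-sql 1.1.0) compiles against whichever API resolves first, which is why the errors point at members that clearly exist in the 1.4.0 docs.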
