
I'm building the Apache Spark source code on Ubuntu 14.04.4 (Spark version 1.6.0, Scala code runner version 2.10.4) with the command

sudo sbt/sbt assembly

and I'm getting the following error:

[warn] def deleteRecursively(dir: TachyonFile, client: TachyonFS) {
[warn] ^
[error]
[error]      while compiling: /home/ashish/spark-apps/spark-1.6.1/core/src/main/scala/org/apache/spark/util/random/package.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -deprecation -Xplugin:/home/ashish/.ivy2/cache/org.spark-project/genjavadoc-plugin_2.10.5/jars/genjavadoc-plugin_2.10.5-0.9-spark0.jar -feature -P:genjavadoc:out=/home/ashish/spark-apps/spark-1.6.1/core/target/java -classpath /home/ashish/spark-apps/spark-1.6.1/core/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/launcher/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/common/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/shuffle/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/unsafe/target/scala-2.10/classes:/home/ashish/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/ashish/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/ashish/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/ashish/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-......and many other jars...
[error]
[error]   last tree to typer: Literal(Constant(collection.mutable.Map))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[scala.collection.mutable.Map])
[error]        symbol owners:
[error]       context owners: package package -> package random
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local package>: <notype> in package random, tree.tpe=org.apache.spark.util.random.package.type
[error]   "java.lang.Object" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(): org.apache.spark.util.random.package.type in package random
[error]     <method>
[error]     "<init>"
[error]     []
[error]     List(Nil)
[error]     <tpt> // tree.tpe=org.apache.spark.util.random.package.type
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error]         package.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.mutable.Map))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[warn] 45 warnings found
[error] two errors found
[error] (core/compile:compile) Compilation failed
[error] Total time: 5598 s, completed 5 Apr, 2016 9:06:50 AM



Where am I going wrong?


2 Answers


You should build Spark with Maven instead.

Download the source and run ./build/mvn clean package (the Maven wrapper ships in build/, not bin/, in the Spark source tree).
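For reference, a minimal sketch of the Maven route as the Spark 1.6 "Building Spark" documentation describes it; the MAVEN_OPTS values are the ones the docs recommend, and -DskipTests (an optional addition, not part of the answer above) skips running the test suite while packaging:

# run from the top of the Spark source tree
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
./build/mvn -DskipTests clean package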



Probably similar to http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-td10532.html

Try sudo sbt/sbt clean assembly
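A minimal sketch of that sequence, run from the top of the Spark source tree (assuming the sbt launcher script that ships with Spark 1.6; splitting the two tasks makes it easier to see which step fails):

# drop stale compiled classes and build state, then rebuild the assembly jar
sudo sbt/sbt clean
sudo sbt/sbt assembly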

1 Comment

It doesn't work for me either; I'm still getting the same error.
