
I have been working with Apache Spark (Scala) and building the packages with sbt. I am able to build the package, but I keep getting an Exception in thread "main" java.net.URISyntaxException: Illegal character in path at index 0: when I run:

./bin/spark-submit \  "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help

and I don't understand why that is the case.

Here is my code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

object creditFraud {

  def main(args: Array[String]) {

    val conf = new SparkConf().setAppName("Transaction")
    val sc = new SparkContext(conf)

    val graph = GraphLoader.edgeListFile(sc, "Users/grantherman/Desktop/transactionFile.csv")

    println("GRAPHX: Number of vertices " + graph.vertices.count)
    println("GRAPHX: Number of edges " + graph.edges.count)
  }
}

Here is the .sbt file:

name := "Transaction"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.3.1" % "provided"


resolvers ++= Seq(
"Akka Repository" at "http://repo.akka.io/releases/",
"Spray Repository" at "http://repo.spray.cc/")

1 Answer

If you enter the command on a single line, remove the "\", so it looks like:

./bin/spark-submit "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help

The "\" is the Bash escape character. If you trying to enter a long command, you can split your command into few lines, like:

./bin/spark-submit \ 
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help

Update: my previous answer only addressed the "java.net.URISyntaxException".

For details on running spark-submit, refer to its documentation: https://spark.apache.org/docs/1.1.0/submitting-applications.html

In your case, you can execute your jar file with the commands below (assuming your main class is org.apache.spark.examples.SparkPi):

./bin/spark-submit --class org.apache.spark.examples.SparkPi "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"

Or split it across multiple lines:

./bin/spark-submit \ 
--class org.apache.spark.examples.SparkPi \
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"

You can also specify the number of cores you wish to run with (say, 4 cores):

./bin/spark-submit \ 
--class org.apache.spark.examples.SparkPi \
--master local[4] \
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"

If you are unsure whether your jar file is working, I suggest playing around with the spark-examples-[version].jar first before getting started.
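For instance (assuming a standard Spark download), you can run the bundled SparkPi example with the run-example wrapper, which saves you from locating the examples jar by hand:

./bin/run-example SparkPi 10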


2 Comments

So I got rid of that, and now I keep getting an error that it can't read the main class. Do I need to add the parameters --class "className" and --master local[4]?
The original answer only addressed your main error, "java.net.URISyntaxException". I have updated my answer, which should let you execute spark-submit successfully.
