
I tried

./spark-2.3.1-bin-hadoop2.7/bin/spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1 test.py 

on my own computer and everything worked fine. But when I tried it on my school's server, I got the messages and errors below. I have searched Google for a long time and have no idea what's wrong. Can anyone help me?

Ivy Default Cache set to: /home/zqwang/.ivy2/cache
The jars for the packages stored in: /home/zqwang/.ivy2/jars
:: loading settings :: url = jar:file:/data/opt/tmp/zqwang/spark-2.3.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-sql-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-26b526c6-0535-4007-8428-e38188af5709;1.0
        confs: [default]
:: resolution report :: resolve 966ms :: artifacts dl 0ms
        :: modules in use:


        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------


:: problems summary ::
:::: WARNINGS
        module not found: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1

==== local-m2-cache: tried

file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

-- artifact

org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

==== local-ivy-cache: tried

/home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/ivys/ivy.xml

-- artifact

org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

/home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/jars/spark-sql-kafka-0-10_2.11.jar

==== central: tried

https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

-- artifact

org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

==== spark-packages: tried

http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

-- artifact

org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

  ::::::::::::::::::::::::::::::::::::::::::::::

  ::          UNRESOLVED DEPENDENCIES         ::

  ::::::::::::::::::::::::::::::::::::::::::::::

  :: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found

  ::::::::::::::::::::::::::::::::::::::::::::::

:::: ERRORS
Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)

Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)

Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)

Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1303)
        at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:53)
        at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:364)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


1 Answer


But when I tried it on my school's server, I got the messages and errors below

Your school has a firewall preventing remote packages from being downloaded.

That URL works for me, for example:

Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
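One quick way to confirm this from the school server itself (assuming curl is available there) is to request the same URL that Ivy failed on and see whether the connection is refused:

# HEAD request against the POM that spark-submit tried to fetch
curl -I https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

If this prints a connection error rather than HTTP response headers, outbound access to Maven Central is blocked on that machine.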

You'll need to download the Kafka jars outside of school, then use the --jars flag to submit with them, for example:
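A minimal sketch of what that could look like (the kafka-clients version below is an assumption based on Spark 2.3.x; check the POM of spark-sql-kafka-0-10_2.11 2.3.1 for the exact transitive dependencies you need):

# On a machine with internet access, download the jars from Maven Central:
#   spark-sql-kafka-0-10_2.11-2.3.1.jar
#   kafka-clients-0.10.0.1.jar        (assumed version; verify against the POM)
# Copy them to the school server, then submit with --jars instead of --packages:

./spark-2.3.1-bin-hadoop2.7/bin/spark-submit \
  --jars spark-sql-kafka-0-10_2.11-2.3.1.jar,kafka-clients-0.10.0.1.jar \
  test.py

Unlike --packages, --jars takes local paths, so nothing has to be downloaded at submit time.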
