
I am using Spark to perform some computations, but I want the job to be submitted from a Java application. It works properly when submitted using the spark-submit script. Has anyone tried to do this?

Thanks.

2 Answers


Don't forget to add the fat JAR containing your code to the context.

val conf = new SparkConf()
   .setMaster(...)
   .setAppName(...)
   .setJars(Seq("/path/to/code.jar")) // setJars takes a Seq of paths
val sc = new SparkContext(conf)
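Since the question is about a Java application, here is a rough Java equivalent of the Scala snippet above (the master URL and JAR path are illustrative; `SparkConf` has a Java-friendly `setJars(String[])` overload):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class JavaDriver {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setMaster("spark://master-host:7077")        // illustrative master URL
            .setAppName("MyApp")
            // fat JAR containing your classes, so workers can deserialize your tasks
            .setJars(new String[] {"/path/to/code.jar"});
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... run your jobs here ...
        sc.stop();
    }
}
```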



As long as you have a master and available worker started, you should be able to if you have the following in your java application:

String master = "spark://IP:7077"; //set IP address to that of your master
String appName = "Name of your Application Here";
SparkConf conf = new SparkConf().setAppName(appName).setMaster(master);
JavaSparkContext sc = new JavaSparkContext(conf);

I was able to run junit tests from within IntelliJ that utilized the JavaSparkContext without having to use the spark-submit script. I am running into issues when performing actions on DataFrames though (not sure if that's related).
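Putting the pieces together, a minimal self-contained driver might look like the sketch below (a sketch, assuming spark-core is on the classpath; the master URL, app name, and JAR path are illustrative, and you can substitute `"local[*]"` as the master to test without a standalone cluster):

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class StandaloneDriver {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("StandaloneDriver")
            .setMaster("spark://IP:7077")                 // or "local[*]" for local testing
            // fat JAR with your code, needed when running against a real cluster
            .setJars(new String[] {"/path/to/code.jar"});
        JavaSparkContext sc = new JavaSparkContext(conf);
        // A simple action to verify the context actually submits work
        long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
        System.out.println("count = " + count);
        sc.stop();
    }
}
```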

4 Comments

I have done the same thing but no luck. If I run it using spark-submit it works perfectly. Have you tried to perform some transformations? @insomniak
For me it worked when providing these options: -Dspark.driver.host=<my ip> -Dspark.driver.port=50000
Could someone please share Java code to run a Spark application?
I solved this problem with the answer provided by @Marius Soutier.
