I am using Spark to perform some computations, but I want the job to be submitted from a Java application. It works properly when submitted using the spark-submit script. Has anyone tried to do this?
Thanks.
As long as you have a master and an available worker started, you should be able to, provided you have the following in your Java application:
String master = "spark://IP:7077"; // set IP to the address of your master node
String appName = "Name of your Application Here";
SparkConf conf = new SparkConf().setAppName(appName).setMaster(master);
JavaSparkContext sc = new JavaSparkContext(conf);
I was able to run JUnit tests from within IntelliJ that used the JavaSparkContext without having to use the spark-submit script. I am running into issues when performing actions on DataFrames, though (not sure if that's related).
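If you would rather keep spark-submit's classpath and environment handling but still drive everything from Java, Spark also ships a SparkLauncher API (in the spark-launcher module) that submits an application jar programmatically. A minimal sketch, where the Spark home path, jar path, driver class name, and master URL are all placeholders you would replace with your own:

```java
import org.apache.spark.launcher.SparkLauncher;

public class SubmitFromJava {
    public static void main(String[] args) throws Exception {
        // All paths and names below are placeholders for your own setup.
        Process spark = new SparkLauncher()
                .setSparkHome("/path/to/spark")           // local Spark installation
                .setAppResource("/path/to/your-app.jar")  // jar containing your driver
                .setMainClass("com.example.YourDriver")   // fully qualified driver class
                .setMaster("spark://IP:7077")             // same master URL as above
                .launch();                                // forks a spark-submit process
        spark.waitFor();                                  // block until the job exits
    }
}
```

This runs the driver in a child process, so it sidesteps classpath conflicts between your host application and Spark, which can matter if your in-process JavaSparkContext runs into dependency issues.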