
How do I supply a custom classpath to a Spark application running in cluster mode (i.e. the driver submits the job and the executors run it)?

The spark.driver.userClassPathFirst option leads to more classpath conflicts. So I tried:

--conf spark.executor.extraClassPath=foo/bar.jar --conf spark.driver.extraClassPath=foo/bar.jar

I passed these options to spark-submit, but I don't see them taking effect.

Does foo/bar.jar have to already be present on the executor hosts, or will these options also make it available there?
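For reference, a simplified sketch of the full command I am running (master, class name, and application jar are placeholders):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --conf spark.executor.extraClassPath=foo/bar.jar \
  --conf spark.driver.extraClassPath=foo/bar.jar \
  my-app.jar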

1 Answer


You can use --jars if you want the job itself to distribute the jars to all executors. Otherwise you need to copy the jars to each host yourself.
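For example, a minimal sketch of a cluster-mode submit that ships the jar with the job (master, class name, and application jar are placeholders):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --jars foo/bar.jar \
  my-app.jar

Spark uploads each jar listed in --jars along with the job, so the jar does not need to be pre-installed on the executor hosts.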

Thanks, Ravi


2 Comments

--jars foo/bar.jar on its own didn't work as expected. Should it go together with other flags? Thanks.
Turns out I had to pass both: --jars to make the files available on the cluster, and the extraClassPath options (mentioned in the question) to put them on the classpath. See the sketch below.
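Putting the resolution together, a sketch of the combined invocation (names and paths are placeholders; note that on YARN, jars shipped via --jars land in each container's working directory, so the extraClassPath entries may need to point to the jar's location there rather than the local path):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --jars foo/bar.jar \
  --conf spark.executor.extraClassPath=foo/bar.jar \
  --conf spark.driver.extraClassPath=foo/bar.jar \
  my-app.jar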
