How do I supply a custom classpath to a Spark application running in cluster mode (i.e. the driver submits, the executors execute)?
The spark.driver.userClassPathFirst option leads to more classpath conflicts, so I tried setting the extra classpath directly:
--conf spark.executor.extraClassPath=foo/bar.jar --conf spark.driver.extraClassPath=foo/bar.jar
I passed these options with spark-submit, but I don't see them taking effect.
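For reference, here is a minimal sketch of the full command I am running, assuming YARN as the cluster manager; the main class (com.example.MyApp) and application jar (my-app.jar) are placeholders for my actual ones:

    # com.example.MyApp and my-app.jar stand in for my real main class and application jar
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.executor.extraClassPath=foo/bar.jar \
      --conf spark.driver.extraClassPath=foo/bar.jar \
      --class com.example.MyApp \
      my-app.jar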
Does foo/bar.jar have to already be present on the executor hosts at that path, or will this setting also distribute it to them?