We are planning to use Scala on Spark for our computations. I want to know the best way to execute Scala in Spark: Scala as a script or Scala as an application. Are there any advantages/disadvantages between these two methods?
As mentioned here, it is possible to execute Scala as a script. I am trying to skip the compilation step with sbt so that I can use Scala as a script, just as we would use Python.
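For context, here is a rough sketch of the two invocation styles I am comparing (file, class, and jar names are placeholders, and the jar path assumes a Scala 2.12 sbt layout):

```shell
# Script style: no sbt build step. spark-shell evaluates the
# Scala file in the interactive shell, similar to running a Python script.
spark-shell -i compute.scala

# Application style: compile and package with sbt, then submit
# the resulting jar to the cluster with spark-submit.
sbt package
spark-submit --class com.example.Main \
  target/scala-2.12/myapp_2.12-0.1.jar
```

Both run the same Scala code; the question is whether skipping the compile/package step has any real drawbacks.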