It is possible to register a UDF in code, on the SparkSession, before using the SQL API.
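For example (a minimal sketch in Scala; the UDF name `strLen` is only for illustration):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("udf-example")
  .getOrCreate()

// Register the UDF on the session before any SQL that uses it.
spark.udf.register("strLen", (s: String) => s.length)

spark.sql("SELECT strLen('hello')").show()
```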
Spark provides a command-line tool, spark-sql, to submit SQL queries.
This tool runs spark-submit with --class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.
With spark-sql there is no hook to register a UDF programmatically before the queries run, but it is possible to add jars or py-files (e.g. with --jars or --py-files).
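For instance, a jar containing a Hive-compatible UDF class can be shipped and the function registered from SQL itself (a sketch; my-udfs.jar and com.example.udfs.StrLen are hypothetical names):

```sql
-- invoked e.g. as: spark-sql --jars /path/to/my-udfs.jar -f init.sql
-- the class must implement a Hive UDF interface
CREATE TEMPORARY FUNCTION strLen AS 'com.example.udfs.StrLen';
SELECT strLen('hello');
```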
What are the ways to use spark-sql with custom registered functions?