
I want to use custom UDFs that are defined in my library. I have used the following code for that:

%spark2
import org.apache.spark.sql.functions.year

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

sqlContext.sql(s"ADD JAR /usr/hdp/current/spark-client/lib/myLib.jar")

val df = sqlContext.sql("select parse_datetime(start_timestamp, 'CET', 'yyyyMMddHHmmssSSS') AS timestamp from temp")

The above code complains about a missing function "parse_datetime", so apparently the ADD JAR statement is not adding myLib. Is there a syntax problem, or what am I missing? I am running this in Zeppelin.

I have also copied myLib.jar into the $SPARK_HOME/lib folder, but it made no difference.
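One thing worth checking: ADD JAR only puts the jar's classes on the session classpath; a Hive UDF still has to be registered with CREATE TEMPORARY FUNCTION before it can be called by name. A minimal sketch of the full sequence, assuming the UDF is implemented by a hypothetical class com.example.udf.ParseDatetime (substitute the actual class name from myLib.jar):

```scala
%spark2
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Make the jar's classes visible to the Hive session
sqlContext.sql("ADD JAR /usr/hdp/current/spark-client/lib/myLib.jar")

// Register the function by its implementing class
// (the class name here is an assumption, not from the question)
sqlContext.sql("CREATE TEMPORARY FUNCTION parse_datetime AS 'com.example.udf.ParseDatetime'")

// Now the function can be used in SQL
val df = sqlContext.sql(
  "select parse_datetime(start_timestamp, 'CET', 'yyyyMMddHHmmssSSS') AS timestamp from temp")
```

Without the CREATE TEMPORARY FUNCTION step, Hive has no mapping from the name parse_datetime to any class, which produces exactly the "missing function" error described above.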

1 Answer
I have found the solution. I gave the exact path of myLib.jar in the dependencies section of the Spark interpreter. It works now.
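For completeness, older Zeppelin releases also allow loading a jar from a notebook itself via a %dep paragraph, which must run before the first %spark2 paragraph (i.e. before the Spark interpreter starts). A sketch using the path from the question:

```scala
%dep
// Load the custom jar into the Spark interpreter's classpath.
// This paragraph has to execute before any %spark2 paragraph;
// restart the interpreter first if Spark has already started.
z.load("/usr/hdp/current/spark-client/lib/myLib.jar")
```

Either approach (interpreter dependencies section or %dep) puts the jar on the interpreter classpath, which is what ADD JAR alone was failing to do reliably here.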


2 Comments

How have you added the jar to the Spark interpreter? Do you mean adding the jar in the pom.xml or build.sbt file?
This was specific to Zeppelin. There, one can add custom jars in the dependencies section of the Spark interpreter in Zeppelin. What is it that you are trying to do?
