I need to register a UDF that takes no arguments, but Apache Spark does not provide a UDF0 interface. I tried something like:
UDF1<Object, String> my_func = o -> "some_generated_string";
sqlContext.udf().register("my_func", my_func, DataTypes.StringType);
But df.withColumn("newCol", functions.expr("concat(col1, my_func())")); throws an exception: org.apache.spark.sql.UDFRegistration$$anonfun$register$25$$anonfun$apply$1 cannot be cast to scala.Function0.
Passing a dummy argument, df.withColumn("newCol", functions.expr("concat(col1, my_func(1))"));, works correctly, but this is the wrong way to do it and smells bad.
UDFRegistration in org.apache.spark.sql has the method register[RT: TypeTag](name: String, func: Function0[RT]): UserDefinedFunction. Java sees this method as register(String name, Function0<RT> func, TypeTag<RT> evidence$1). I could write a scala.Function0 implementation, but what is the TypeTag evidence$1, and how do I construct one from Java?
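For reference, the Function0 part I have in mind looks like this; a minimal sketch, assuming scala-library is on the classpath (extending scala.runtime.AbstractFunction0 so only apply() needs to be implemented):

```java
import scala.runtime.AbstractFunction0;

// Sketch of a no-argument function for register(name, func, evidence$1).
// The class name MyFunc is just an illustration.
public class MyFunc extends AbstractFunction0<String> {
    @Override
    public String apply() {
        return "some_generated_string";
    }
}
```

It is only the implicit TypeTag<String> evidence parameter, which Scala would normally materialize at the call site, that I don't know how to supply from Java.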