I have a case class:
case class MyCaseClass(City: String, Extras: Map[String, String])
and a user-defined function that returns a scala.collection.immutable.Map:
val extrasUdf = spark.udf.register(
  "extras_udf",
  (age: Int, name: String) => Map("age" -> age.toString, "name" -> name)
)
but this breaks with an exception when I try to convert the result to the case class:
import spark.implicits._

spark.read.options(...).load(...)
  .select('City, 'Age, 'Name)
  .withColumn("Extras", extrasUdf('Age, 'Name))
  .drop('Age)
  .drop('Name)
  .as[MyCaseClass]
I suspect I should use Spark SQL's MapType(DataTypes.StringType, DataTypes.StringType) (the UDF produces String values), but I can't find any working example...
Everything works if I declare Extras as scala.collection.Map[String, String], but I need an immutable Map.
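For context, the workaround I'm trying to avoid is deserializing into an intermediate case class with scala.collection.Map and then copying into an immutable Map with toMap. A minimal sketch of that conversion step in plain Scala (the Raw class and toImmutable helper are my own illustrative names, not Spark API; in the real pipeline this would run inside a .map over the Dataset):

```scala
// Intermediate shape matching what the encoder accepts (scala.collection.Map),
// and the target shape I actually want (immutable Map).
case class Raw(City: String, Extras: scala.collection.Map[String, String])
case class MyCaseClass(City: String, Extras: Map[String, String])

// toMap copies the entries into a scala.collection.immutable.Map.
def toImmutable(r: Raw): MyCaseClass =
  MyCaseClass(r.City, r.Extras.toMap)

val raw   = Raw("Berlin", scala.collection.Map("age" -> "30", "name" -> "Bob"))
val fixed = toImmutable(raw)
// fixed.Extras is now an immutable Map("age" -> "30", "name" -> "Bob")
```

This works, but it adds an extra case class and a full copy of every row just to change the Map's static type, which is what I'd like to avoid.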