In the Spark shell (1.6), I was converting a List of strings into an array of columns like this:
val mapData = List("column1", "column2", "column3")
val values = array(mapData.map(col): _*)
The type of values is:
values: org.apache.spark.sql.Column = array(column1,column2,column3)
Everything worked fine, but when I started developing in Eclipse I got the error:
not found: value array
So I changed to this:
val values = Array(mapData.map(col): _*)
The problem I then faced was that the type of values had changed, and the udf consuming it no longer accepted the new type:
values: Array[org.apache.spark.sql.Column] = Array(column1, column2, column3)
Why am I not able to use array() in my IDE the way I can in the shell (what import am I missing)? And why does array produce a single org.apache.spark.sql.Column instead of an Array[Column]?
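For context, the spark-shell pre-imports org.apache.spark.sql.functions._, which is where the lowercase array comes from; an IDE project gets no such automatic imports. A minimal sketch of what the explicit imports might look like (assuming a standard Spark 1.6 project with spark-sql on the classpath):

```scala
// In spark-shell these come in via the pre-imported
// org.apache.spark.sql.functions._; in an IDE they must be explicit.
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{array, col}

val mapData = List("column1", "column2", "column3")

// functions.array(...) is a SQL function: it returns ONE Column whose
// expression builds an array, not a Scala Array[Column]. That is why
// its type prints as org.apache.spark.sql.Column.
val values: Column = array(mapData.map(col): _*)
```

By contrast, Scala's Array(...) is the ordinary collection constructor, which is why it yields an Array[org.apache.spark.sql.Column] that a Column-typed parameter will not accept.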
Edit: The udf function:
def replaceFirstMapOfArray =
  udf((p: Seq[Map[String, String]], o: Seq[Map[String, String]]) => {
    if (null != o && null != p) {
      if (o.size == 1) p
      else p ++ o.drop(1)
    } else {
      o
    }
  })
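For illustration, a sketch of how such a udf is typically applied; df and the column names "p_maps" and "o_maps" are hypothetical stand-ins for columns of type array&lt;map&lt;string,string&gt;&gt;:

```scala
import org.apache.spark.sql.functions.col

// Hypothetical usage: df is an existing DataFrame, and "p_maps" /
// "o_maps" are assumed columns of type array<map<string,string>>.
// The udf takes Column arguments, which is why a single Column built
// with functions.array(...) works where an Array[Column] does not.
val result = df.withColumn(
  "merged",
  replaceFirstMapOfArray(col("p_maps"), col("o_maps")))
```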