
I have a DataFrame in Spark 2.2 and I want to read a column value as a String.

val df1 = df.withColumn("col1",
      when(col("col1").isNull, col("col2") + "some_string"))

When col1 is null, I want to take the string value from col2 and append my own string to it.

The problem is that col("col2") is always an org.apache.spark.sql.Column. How can I convert this value to a String so that I can append my custom string?
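
For context, a minimal reproducible sketch of the kind of data involved (the sample values and the local SparkSession are assumptions, not taken from the question):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// col1 is sometimes null, col2 always holds a string (values are made up)
val df = Seq(
  (null.asInstanceOf[String], "abc"),
  ("keep_me", "xyz")
).toDF("col1", "col2")

// Desired result: where col1 is null, it should become "abc" + "some_string"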


2 Answers


lit and concat will do the trick. You can turn a string value into a Column using the lit function, and with the concat function you can concatenate it to the string value of the column.

import org.apache.spark.sql.functions._

df.withColumn("col1", when(col("col1").isNull,
  concat(col("col2"), lit("some_string"))))
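
One caveat worth noting (not part of the original answer): when without an otherwise clause returns null for rows where the condition is false, so any existing non-null col1 values would be lost. A minimal sketch that preserves them, assuming that is the desired behaviour:

import org.apache.spark.sql.functions._

// Keep the existing col1 value when it is not null,
// otherwise fall back to col2 with the suffix appended
val df1 = df.withColumn("col1",
  when(col("col1").isNull, concat(col("col2"), lit("some_string")))
    .otherwise(col("col1")))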



You can use the lit function to turn the string value into a Column and then use the concat function.

val df1 = df.withColumn("col1",
      when(col("col1").isNull, concat(col("col2"), lit("some_string"))))

Hope this helps!

