I am using Spark 2.2.0 and Scala 2.11.8 in the spark-shell. I have a DataFrame df, and I need to select the previous day's data based on the value of the column 'date' and then append it to an HDFS location (e.g. if today is 2018-06-28, I need the data for 2018-06-27).
Below is the code:
df.filter($"date" === "2018-06-27") .write.mode(SaveMode.Append).parquet("hdfs:/path..../date=2018-06-27")
I need to automate this, so the literal "2018-06-27" has to be replaced by a variable, both in the filter value and in the directory name. So if I have a string date_test: String = 2018-06-27, the code below should still work:
df.filter($"date" === "date_test") .write.mode(SaveMode.Append).parquet("hdfs:/path..../date=date_test")
How can I do this?
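My current idea is to use Scala string interpolation for both the filter value and the output path, roughly like the sketch below (the HDFS path is still just the placeholder from above, and the value of date_test would be computed by the automation job), but I am not sure whether this is the right approach:

import org.apache.spark.sql.SaveMode

// would come from the automation job, e.g. derived from the current date
val date_test: String = "2018-06-27"

df.filter($"date" === date_test)                    // pass the variable, not the literal string "date_test"
  .write
  .mode(SaveMode.Append)
  .parquet(s"hdfs:/path..../date=$date_test")       // s-interpolation to build the directory name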