I'm having trouble writing a statement to count the number of columns in Spark SQL. I tried using the information schema and `table.columns`, but neither seems to work in Spark SQL. Does anyone have any suggestions?
1 Answer
Below are a couple of snippets you can use to count the number of columns in Spark SQL.
PySpark solution:
df_cont = spark.table("your_table")  # replace with the right reader for your source
print("Number of columns: " + str(len(df_cont.columns)))
Scala solution:
val df_cont = spark.table("your_table") // replace with the right reader for your source
val length = df_cont.columns.length
println("Number of columns: " + length)