
I am a newbie to Scala. I have this code:

import org.apache.spark.sql.types.{StructType, DataType, AnyDataType}
..

val test = spark.sparkContext.parallelize(Tables.value.getOrElse("db.code_tbls",collection.immutable.Map.empty[String, Any]).toList.seq).toDF("code","cde_desc").as[(String, String)]

The line above is throwing a java.lang.ClassNotFoundException: scala.Any error. I also tried making my test variable implicit, but I still get the same error.

I also tried changing Any to String in the .empty[String, Any] part, but the error persists.

  • Could you please help me understand your piece of code? Commented Mar 12, 2019 at 11:29
  • In my code, Tables is the dataset variable that holds Hive table data. If I can't get the data from my dataset, I use getOrElse to fall back to an empty map, which I convert to a List. Then I convert it to a DataFrame with the schema "code", "cde_desc", both of type String. Please let me know if I need to add more info. Commented Mar 12, 2019 at 13:42
  • Are you using IntelliJ? If so, try Invalidate Caches / Restart. Commented Mar 12, 2019 at 14:23
  • Yes, I tried that, but no luck. For the time being I replaced collection.immutable.Map.empty[String, Any] with null, and it's working now. Thanks. Commented Mar 14, 2019 at 5:05

1 Answer


Try collection.immutable.Map[String, Any]()
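For context, the root cause is likely that Spark can derive an Encoder for Map[String, String] but not for Map[String, Any]: scala.Any has no encoder, which is what surfaces as ClassNotFoundException: scala.Any. A minimal sketch of the workaround, assuming the maps stored in Tables.value have String values (the names Tables and db.code_tbls are taken from the question; nothing here is a verified fix for the asker's exact environment):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("encoder-sketch").getOrCreate()
import spark.implicits._ // brings in the Encoder for (String, String)

// Use Map.empty[String, String] as the fallback so the element type of the
// resulting List is (String, String), which Spark knows how to encode.
// With Map.empty[String, Any], the list's element type widens to
// (String, Any), and no Encoder exists for scala.Any.
val rows: List[(String, String)] = Tables.value
  .getOrElse("db.code_tbls", Map.empty[String, String])
  .toList

val test = spark.sparkContext
  .parallelize(rows)
  .toDF("code", "cde_desc")
  .as[(String, String)]
```

Note that this only compiles if Tables.value itself maps to Map[String, String]; if the stored maps really contain non-String values, they need to be converted (e.g. with .toString) before parallelizing.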
