
I tried to create a variable/val named "org" in spark-shell (Scala version 2.10.5), but it throws an error.

I tried with both var and val.

 var org = List(1)

error: value apache is not a member of List[Int]
       org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this)

  • Is this your whole script? It looks like somewhere below you try to import org.apache..... and your local variable org broke that. Commented Jul 19, 2019 at 7:54
  • @Thilo, I just started spark-shell and then executed var org = List(1). Commented Jul 19, 2019 at 8:02
  • Well, probably a bug in spark-shell. It seems that it wants to call org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope internally. Just call the variable something else. Commented Jul 19, 2019 at 8:20
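
As those comments suggest, the error looks like ordinary Scala name shadowing rather than anything Spark-specific: a local value named org hides the top-level org package, so the org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this) call that spark-shell generates behind the scenes no longer resolves. Below is a rough sketch of the same effect in a newer spark-shell (where defining org itself succeeds) and the _root_ qualifier that sidesteps it; it is illustrative only, not output copied from the original session.

// Sketch of the shadowing in a Spark 2.x spark-shell (illustrative, not verbatim output)
val org = List(1)                                // succeeds in newer shells

// Any later reference to the org.apache... packages now resolves against the val:
// import org.apache.spark.sql.functions._
//        ^ would fail with: value apache is not a member of List[Int]

// Qualifying from the root package bypasses the local `org`:
import _root_.org.apache.spark.sql.functions._

// Or simply pick another name, as the last comment suggests:
val orgList = List(1)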

1 Answer


Well, I am able to create lists and DataFrames with the name 'org':

scala> val org = List(1)
org: List[Int] = List(1)

scala> var org = List(1)
org: List[Int] = List(1)

scala>

scala>

scala> val org = Seq((1,2,3), (2,3,4)).toDF()
org: org.apache.spark.sql.DataFrame = [_1: int, _2: int ... 1 more field]

scala> org
res0: org.apache.spark.sql.DataFrame = [_1: int, _2: int ... 1 more field]

scala> org.show
+---+---+---+
| _1| _2| _3|
+---+---+---+
|  1|  2|  3|
|  2|  3|  4|
+---+---+---+

I am using Spark version 2.2.0 and Scala 2.11.8, so this issue might have been fixed in later versions. I am, however, aware that you cannot create values or variables using certain reserved keywords. For example:

scala> val new = Seq((1,2), (2,3)).toDF()
<console>:1: error: illegal start of simple pattern
val new = Seq((1,2), (2,3)).toDF()
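
If you really do need an identifier that clashes with a keyword, plain Scala lets you quote it with backticks; a small sketch, separate from the Spark session above:

// Backquoted identifiers may reuse reserved words (plain Scala, no Spark needed):
val `new` = Seq((1, 2), (2, 3))
println(`new`)            // List((1,2), (2,3))

// The backticks are required at every later reference as well:
val first = `new`.head    // (1,2)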

'org' might be a keyword in the earlier version. Hope that helps!
