This is a simplified version of my code that is failing to compile:

object CompileFail {
  import org.apache.spark.sql.{ Encoder, SparkSession }

  case class Foo(x: Int)

  def run(spark: SparkSession): Int => Int = {
    import spark.implicits._
    add(bar(encode[Foo]))
  }

  def bar(f: => Int): Int = 0

  def encode[A : Encoder]: Int = 0

  def add(a: Int)(b: Int) = a + b
}

It fails with the following nonsensical message:

[error] Error while emitting CompileFail.scala
[error] value $u
[error] one error found
[error] Compilation failed

I am on Scala 2.12.15 and Spark 3.1.2 (but it fails on older Spark versions too).

Interesting to note:

  1. If I change add(a)(b) to add(a, b) it compiles
  2. If I change bar(f: => Int) to bar(f: Int) it compiles
  3. If I change add(bar(encode[Foo])) to add(bar(encode[String])) it compiles

What am I doing wrong?

  • I strongly think it has something to do with the implicits the Encoder library brings in. – Commented Feb 21, 2022 at 19:03

1 Answer


I don't think you are doing anything wrong; this looks like a bug in the Scala compiler's implicit resolution. Simply hoisting the encoded value into a local val is enough to make it compile:

object CompileFail {
  import org.apache.spark.sql.{ Encoder, SparkSession }

  case class Foo(x: Int)

  def run(spark: SparkSession): Int => Int = {
    import spark.implicits._
    val i = encode[Foo]
    add(bar(i))
  }

  def bar(f: => Int): Int = 0

  def encode[A : Encoder]: Int = 0

  def add(a: Int)(b: Int) = a + b
}

It looks like there is a conflict between the implicit Encoder resolution and the eta-expansion that turns the partially applied add into the Int => Int function value.
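For comparison, here is a Spark-free sketch of the same shape (encodeLike is a hypothetical stand-in for encode[Foo]: same result type, but no implicit Encoder to resolve). It compiles and runs fine, which supports the idea that the Encoder derivation, not the eta-expansion itself, is what trips up the compiler:

```scala
object CompilesFine {
  def bar(f: => Int): Int = 0

  // Hypothetical stand-in for encode[Foo]: returns an Int
  // without requiring an implicit Encoder.
  def encodeLike: Int = 0

  def add(a: Int)(b: Int) = a + b

  def run: Int => Int =
    // add(bar(encodeLike)) fixes only the first parameter list;
    // the compiler eta-expands it to the function value b => add(0)(b)
    add(bar(encodeLike))
}
```

Since bar always returns 0, the resulting function just adds 0 to its argument, so CompilesFine.run(5) yields 5.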
