
Using Java with Apache Spark (which is written in Scala), I am faced with an old API method (the org.apache.spark.rdd.JdbcRDD constructor) that takes an AbstractFunction1 as its argument:

abstract class AbstractFunction1[@scala.specialized -T1, @scala.specialized +R]() extends scala.AnyRef with scala.Function1[T1, R] {}

Because AbstractFunction1 is an abstract class, I can't use Java 8 lambdas with it, so I decided to wrap the scala.Function1 trait instead. It is essentially the same as java.util.function.Function, except that it also declares andThen and compose. So I created this interface:

import java.io.Serializable;

import scala.Function1;

@FunctionalInterface
public interface Funct<T, R> extends Function1<T, R>, Serializable {

    @Override
    default <A> Function1<A, R> compose(Function1<A, T> before) {
        return null;
    }

    @Override
    default <A> Function1<T, A> andThen(Function1<R, A> g) {
        return null;
    }
}

The IDE has no problems with this interface, but when compiling I get:

[ERROR] Funct is not a functional interface
[ERROR] multiple non-overriding abstract methods found in interface Funct

Is it possible to wrap Scala's trait so that I can use lambdas for a method like:

void doMagic(scala.Function1<T,V> arg)

1 Answer


So you want a functional interface version of the Scala function traits so you can call them with Java 8 lambda syntax? You don't have to do this yourself. Take a look at https://github.com/scala/scala-java8-compat . It's not quite as nice as directly using Java 8 lambda syntax, since you still have to do

import static scala.compat.java8.JFunction.*;
doMagic(func(x -> ...));

instead of

doMagic(x -> ...);

But in Scala 2.12, a big theme will be Java 8 compatibility. The Scala FunctionX classes themselves will be redone as SAM interfaces, so you will be able to do the latter once Scala 2.12 is out. At least that was the plan last time I checked.
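For illustration, the SAM-defaulting trick the compat library relies on can be sketched in plain Java. MiniFunction1 below is a hypothetical stand-in for scala.Function1 (the real trait also carries @specialized apply variants, which is why defaulting only andThen and compose, as in the question, is not enough); once every method except apply has a default body, lambdas target the interface directly:

```java
import java.io.Serializable;

// Hypothetical stand-in for scala.Function1: several methods, only apply abstract.
interface MiniFunction1<T, R> {
    R apply(T t);

    default <A> MiniFunction1<A, R> compose(MiniFunction1<A, T> before) {
        return a -> apply(before.apply(a));
    }

    default <A> MiniFunction1<T, A> andThen(MiniFunction1<R, A> after) {
        return t -> after.apply(apply(t));
    }
}

// Valid SAM type: apply is the single remaining abstract method.
@FunctionalInterface
interface Funct<T, R> extends MiniFunction1<T, R>, Serializable {}

public class SamDemo {
    public static void main(String[] args) {
        Funct<Integer, Integer> inc = x -> x + 1;   // lambda syntax now works
        System.out.println(inc.apply(41));          // prints 42
        System.out.println(inc.andThen(y -> y * 2).apply(20)); // prints 42
    }
}
```

This is why the question's Funct fails: defaulting andThen and compose still leaves the specialized abstract methods of the real Function1 unimplemented, so the compiler sees multiple abstract methods.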


2 Comments

Generally, you caught my idea. The Java API calls to Spark are wrapped to allow using lambdas, e.g. javardd.map(Function<T,E> arg), where org.apache.spark.api.java.function.Function is a functional interface. But in places the old API is not fully adapted and there are methods like javardd.map(scala.Function1<T,E> arg) (or ones taking derived abstract classes), and there Function1 is not a functional interface from Java's point of view. Thanks for the library, but this is the only place where I have this problem, and I don't want to pull in the full compatibility kit for one case. That is why I wrote a custom interface.
Makes sense. Maybe you can just look at the source of scala-java8-compat to see how they do it?
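For the abstract-class case mentioned in the question (parameters typed as AbstractFunction1 or similar), a small adapter restores lambda syntax without any extra library. This is only a sketch: AbstractFn, Fns.fn, and doMagic are hypothetical stand-ins, not Spark or scala-java8-compat names:

```java
import java.util.function.Function;

// Hypothetical stand-in for an API parameter typed as an abstract class
// (like org.apache.spark.rdd.JdbcRDD's AbstractFunction1 argument).
abstract class AbstractFn<T, R> {
    public abstract R apply(T t);
}

final class Fns {
    // Adapter: wrap a lambda-friendly java.util.function.Function
    // in a concrete subclass of the abstract class.
    static <T, R> AbstractFn<T, R> fn(Function<T, R> f) {
        return new AbstractFn<T, R>() {
            @Override public R apply(T t) { return f.apply(t); }
        };
    }
}

public class AdapterDemo {
    static String doMagic(AbstractFn<Integer, String> arg) {
        return arg.apply(7);
    }

    public static void main(String[] args) {
        // The lambda goes through the adapter instead of targeting the abstract class.
        System.out.println(doMagic(Fns.fn(x -> "value:" + x))); // prints value:7
    }
}
```

This is essentially what scala-java8-compat's JFunction helpers do, just specialized for the Scala function types.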
