- Could it have anything to do with the fact that Scala inherits Java's type erasure? — Justin Niessner, Mar 24, 2010
- @Justin: What does type erasure have to do with this? — missingfaktor, Mar 24, 2010
- Why don't you ask Jorge Ortiz why he advises against method overloading? — John, Mar 24, 2010
- Not sure if it's applicable since I don't know Jorge's original intent, but: michid.wordpress.com/2008/02/08/… — Justin Niessner, Mar 24, 2010
- bah... bit.ly/aduyIn :'( — missingfaktor, Mar 26, 2010
3 Answers
Overloading makes it a little harder to lift a method to a function:
object A {
  def foo(a: Int) = 0
  def foo(b: Boolean) = 0
  def foo(a: Int, b: Int) = 0
  val function = foo _ // fails, must use = foo(_, _) or (a: Int) => foo(a)
}
You cannot selectively import one of a set of overloaded methods.
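For instance (a sketch, with invented object names): an import of an overloaded name always drags in every alternative, and even a rename applies to the whole overload set:

```scala
object A {
  def foo(a: Int) = 0
  def foo(b: Boolean) = 0
}

object Client {
  // This imports *all* overloads of `foo`; there is no syntax to pick one.
  import A.foo

  // A rename carries the whole overload set along too:
  // import A.{foo => renamed} // renamed(1) and renamed(true) both compile
}
```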
There is a greater chance that ambiguity will arise when trying to apply implicit views to adapt the arguments to the parameter types:
scala> implicit def S2B(s: String) = !s.isEmpty
S2B: (s: String)Boolean
scala> implicit def S2I(s: String) = s.length
S2I: (s: String)Int
scala> object test { def foo(a: Int) = 0; def foo(b: Boolean) = 1; foo("") }
<console>:15: error: ambiguous reference to overloaded definition,
both method foo in object test of type (b: Boolean)Int
and method foo in object test of type (a: Int)Int
match argument types (java.lang.String)
object test { def foo(a: Int) = 0; def foo(b: Boolean) = 1; foo("") }
It can quietly render default parameters unusable:
object test {
  def foo(a: Int) = 0
  def foo(a: Int, b: Int = 0) = 1
}
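To see why the default is dead, consider a one-argument call against the object above (a sketch of the behavior observed in Scala 2, where overload resolution prefers the exact-arity alternative over filling in defaults):

```scala
object test {
  def foo(a: Int) = 0
  def foo(a: Int, b: Int = 0) = 1
}

// The one-argument call resolves to the first alternative, so the
// default for `b` can never be reached by any call syntax.
test.foo(1)    // picks the first overload
test.foo(1, 2) // the only way to invoke the second overload
```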
Individually, these reasons don't compel you to completely shun overloading. I feel like I'm missing some bigger problems.
UPDATE
The evidence is stacking up.
- It complicates the spec
- It can render implicits unsuitable for use in view bounds.
- It limits you to introducing default parameters on at most one of the overloaded alternatives.
- Because the arguments are typed without an expected type, you can't pass anonymous function literals like `_.foo` as arguments to overloaded methods.
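A minimal sketch of the last point (the method `bar` and its overloads are invented for illustration): arguments to an overloaded method get no expected type, so the parameter type of a placeholder function cannot be inferred:

```scala
object C {
  def bar(f: Int => Int) = f(1)
  def bar(f: String => Int) = f("a")

  // bar(_.hashCode)          // error: missing parameter type for expanded function
  bar((i: Int) => i.hashCode) // OK once the parameter type is explicit
}
```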
UPDATE 2
- You can't (currently) use overloaded methods in package objects.
- Applicability errors are harder to diagnose for callers of your API.
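As a sketch of that diagnostic gap (names invented; the error text approximates Scala 2's wording): with overloads, the compiler can only report that no alternative applies, instead of pinpointing the single mismatched argument:

```scala
object D {
  def baz(a: Int, b: String) = 0
  def baz(a: String, b: Int) = 0

  // baz(1, 2)
  // error: overloaded method value baz with alternatives:
  //   (a: String, b: Int)Int <and> (a: Int, b: String)Int
  //  cannot be applied to (Int, Int)
  //
  // With a single `baz(a: Int, b: String)`, the compiler would instead
  // flag exactly which argument has the wrong type.
}
```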
UPDATE 3
- Static overload resolution can rob an API of all type safety:
scala> object O { def apply[T](ts: T*) = (); def apply(f: (String => Int)) = () }
defined object O
scala> O((i: String) => f(i)) // oops, I meant to call the second overload but someone changed the return type of `f` when I wasn't looking...
9 Comments
- The `_.foo` issue is Scala's limited type inference, not overloading itself. You answer the question, but some of the reasons are due to other weaknesses in Scala that could be improved. Overloading is more efficient than runtime downcasting of a disjunction, and a Cartesian product of names is noisy and disconnects from a shared semantic.
- addIntToDouble, addDoubleToInt, i.e. a Cartesian product of names instead of static typing for every common semantic. Replacing typing with naming seems regressive. Java got more things correct than perhaps we recognize.

The reasons that Gilad and Jason (retronym) give are all very good reasons to avoid overloading if possible. Gilad's reasons focus on why overloading is problematic in general, whereas Jason's reasons focus on why it's problematic in the context of other Scala features.
To Jason's list, I would add that overloading interacts poorly with type inference. Consider:
val x = ...
foo(x)
A change in the inferred type of x could alter which foo method gets called. The value of x need not change, just the inferred type of x, which could happen for all sorts of reasons.
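A concrete sketch of that hazard (the `Logger` object and its methods are invented): widening the static type of the argument, without touching its value, silently reroutes the call to a different overload:

```scala
object Logger {
  def log(x: Any): String = "generic"
  def log(s: String): String = "string"
}

val a = "hello"      // inferred as String
val b: Any = "hello" // same value, wider static type

Logger.log(a) // "string"
Logger.log(b) // "generic" -- the inferred type alone changed the target
```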
For all of the reasons given (and a few more I'm sure I'm forgetting), I think method overloading should be used as sparingly as possible.
1 Comment
- `foo` should be the same for every overload with the same number of parameters; otherwise it was designed incorrectly. As for limiting the scope of a bizarre cascade of inference changes: public methods should always declare their return types. I think this was one of the issues affecting Scala binary compatibility between versions.

I think the advice is not meant for Scala especially, but for OO in general (as far as I know, Scala is supposed to be a best-of-breed blend of OO and functional).
Overriding is fine: it's the heart of polymorphism and central to OO design.

Overloading, on the other hand, is more problematic. With method overloading it's hard to discern which method will really be invoked, and it is indeed a frequent source of confusion. There is also rarely a justification for why overloading is really necessary. The problem can usually be solved another way, and I agree that overloading is a smell.
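The classic illustration of this confusion (a sketch with invented names): overload selection is fixed at compile time from the argument's static type, whereas overriding dispatches at runtime on the dynamic type:

```scala
class Base
class Derived extends Base

object Printer {
  def describe(b: Base) = "base"
  def describe(d: Derived) = "derived"
}

val x: Base = new Derived
Printer.describe(x) // "base": chosen from the static type, not the runtime value
```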
Here is an article that explains nicely what I mean by "overloading is a source of confusion", which I think is the prime reason it's discouraged. It's about Java, but I think it applies to Scala as well.