Conversation
Thank you @dwijnand and @liufengyun for submitting the proposal. I have assigned a team of reviewers to it.
odersky left a comment:
This is a very intriguing and ambitious proposal. I have one major concern: it looks to me that pattern arguments would not be supported (happy to be shown wrong on this).
Since the proposal is very far reaching, I think it would be important to do a deeper exploration. Ideally in the form of a paper submitted to a conference like the Scala symposium.
There's also a possible connection to dependent types and refinement types here. Maybe @mbovel should take a look to see whether something clicks.
`Zero.type`, and `Neg`, using the `ordinal` method:

```scala
given TypeTest[Int, Zero.type] = (n: Int) => if ((n: Num).ordinal == 0) Some(n) else None
```
I don't think this would work for multiple sealed extensions of `Int`. `(n: Num)` has no special effect since it is an alias of `(n: Int)`. But one could probably define a sealed-type-specific ordinal as a normal method, i.e.:

```scala
def Num$ordinal(n: Int) = ...
...
given TypeTest[Int, Zero.type] = (n: Int) => if Num$ordinal(n) == 0 then Some(n) else None
```

4. Each case must define a new type or singleton type
## Alternative Design
I think I prefer the original design over the alternative. What's neat is that there is a single ordinal method that chooses the right alternative. That means we can just rely on normal sequential pattern matching. By contrast in the alternative design we risk ambiguity if the definitions of the individual TypeTests have overlapping conditions.
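To make the ambiguity risk concrete, here is a hedged sketch (not from the proposal) of two independently defined `TypeTest`s whose conditions overlap; `Small` and `Even` are made-up illustrative types:

```scala
import scala.reflect.TypeTest

object Splits:
  opaque type Small <: Int = Int // hypothetical: 0 to 9
  opaque type Even  <: Int = Int // hypothetical: even numbers

  given TypeTest[Int, Small] with
    def unapply(n: Int): Option[n.type & Small] =
      if 0 <= n && n < 10 then Some(n.asInstanceOf[n.type & Small]) else None

  given TypeTest[Int, Even] with
    def unapply(n: Int): Option[n.type & Even] =
      if n % 2 == 0 then Some(n.asInstanceOf[n.type & Even]) else None

import Splits.{given, *}

// 4 satisfies both tests, so which case fires depends only on textual
// order; there is no shared `ordinal` to make the cases disjoint.
def classify(n: Int): String = n match
  case _: Small => "small"
  case _: Even  => "even"
  case _        => "other"
```

Since a value like 4 satisfies both conditions, reordering the first two cases changes the result, which is exactly the ambiguity a single shared `ordinal` method rules out.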
Indeed, I've held non-ambiguity as core to the design.
1. If the match on the value of the underlying type is not exhaustive, then the sealed type must be declared `opaque`, in order to preserve the fact that the sealed type represents only a subset of the values of the underlying type (e.g. positive integers)
2. No other type may be declared to subtype the opaque type `T`
I think that restriction is tricky to implement. Why do we need it?
Hmm, I can't remember now. @liufengyun, does anything come to mind?
As far as I remember: suppose `Pos` represents positive (even or odd) numbers, `opaque type Pos = Int { ... }`; here `Pos` is not a partition of `Int`.
Having another type `type S <: Pos = Int` can break the abstraction: now negative numbers can take the type `Pos`.
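For illustration, a minimal current-Scala sketch of the invariant at stake (the `Nums` object and the `fromInt` constructor are hypothetical names, not from the proposal):

```scala
object Nums:
  // Pos is intended to contain only positive Ints.
  opaque type Pos <: Int = Int

  object Pos:
    // The only way to obtain a Pos from outside Nums.
    def fromInt(n: Int): Option[Pos] =
      if n > 0 then Some(n) else None

// Outside Nums, the invariant "every Pos is positive" holds because
// Pos values can only be created via Pos.fromInt. A declaration like
// `type S <: Pos = Int` (as discussed above) would let any Int,
// including negatives, be typed as a subtype of Pos, breaking this.
@main def demo(): Unit =
  println(Nums.Pos.fromInt(3))  // Some(3)
  println(Nums.Pos.fromInt(-3)) // None
```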
```scala
sealed type Num = Int {
  case 0 => val Zero
  case n if n > 0 => type Pos
```
This is neat, but what if I want to expose type arguments in patterns? I mean the typical general case of extractor-based matching would be

```scala
selector match
  case Pat1(xs1) =>
  ...
  case PatN(xsN) =>
```

It seems there's no way to get to the extractor-specific subpatterns?
So it's perhaps sadly too mechanical, but with the sealed type definition we're just splitting the type, which enables `case _: Pat1 =>`. If you want to decompose `Pat1`, then you'd write an `object Pat1 { def unapply(x: Pat1): (Int, String) = (x.foo, x.bar) }`. Or, even more verbosely, with a non-allocating `Pat1Extractor` value class as a result type...
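To make the manual-extractor route concrete, a hedged sketch (all names here, `Pat1`, `foo`, `bar`, `Pat1Impl`, are hypothetical, following the comment above):

```scala
// Hypothetical carrier for a Pat1 value with two fields.
final case class Pat1Impl(foo: Int, bar: String)
type Pat1 = Pat1Impl

object Pat1:
  def apply(foo: Int, bar: String): Pat1 = Pat1Impl(foo, bar)
  // Decomposing extractor: a product-typed unapply always matches
  // and exposes the fields as subpatterns.
  def unapply(x: Pat1): (Int, String) = (x.foo, x.bar)

@main def run(): Unit =
  Pat1(1, "a") match
    case Pat1(n, s) => println(s"$s/$n") // prints "a/1"
```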
I see another alternative that should be explored. It's inspired by a very old paper:

The idea is that we could also encapsulate the logic in an enum `IntSplit`:

```scala
enum IntSplit:
  case Pos(x: Int)
  case Neg(x: Int)
  case Zero

object IntSplit:
  def view(x: Int): IntSplit =
    if x < 0 then Neg(x)
    else if x > 0 then Pos(x)
    else Zero
```

Then the pattern match would look like this:

```scala
IntSplit.view(n) match
  case IntSplit.Zero   =>
  case IntSplit.Pos(x) =>
  case IntSplit.Neg(x) =>
```

It's naturally exhaustive. All of this can be done with current Scala, so you might ask why a language extension is needed. The only reason I could see is that we might want to avoid the explicit call of the `view` method. So this makes me think of defining some kind of implicit "pattern-view" conversion that is inserted when the patterns are part of a `trait PatternView[T]`:

```scala
trait PatternView[T]:
  def view(x: T): Any
```

and change the first line of the example above to

```scala
enum IntSplit extends PatternView[Int]
```

Then, if all cases of a pattern are cases from the same pattern view, wrap the corresponding view method around the selector before going into the unapply or comparison. During type checking, the view method calls would have to be
odersky left a comment:
So in light of all this I'd say this needs to go back to the drawing board.
That design incurs a boxing cost, which is part of the starting point of this proposal.
All in all, as I understand, the only improvement over Scala today is the elimination of boxing and a slight reduction of boilerplate code when the user is defining arbitrary equivalence classes over values of some already defined type.
I'm still unsure if this is enough to justify increasing the complexity of the language.
Moreover, I find the syntax proposed for the declaration of sealed types confusing. I think that if we decide to go on with the proposal, we should rethink the syntax aspect.
```scala
sealed type Num = Int {
  case 0 => val Zero
  case n if n > 0 => type Pos
  case _ => type Neg
}
```
I suspect this syntax is very confusing. I ran a very simple test: I showed the snippet above to 6 folks who have experience in both Scala 2 and Scala 3 but do not know about this proposal.
None of them could guess what the snippet is supposed to do. After revealing what it desugars to, they unanimously agreed that it would be a constant source of confusion.
Firstly, I believe that even years after it lands in the language, googling "sealed types scala" would return results that are overwhelmingly about sealed traits. Sealed traits have been in the language much longer, there is much more written about them, all their descriptions probably contain the word "types" somewhere, and search engines are not very good at interpreting subtle differences in queries.
Another aspect here is that the syntax looks like a mashup of refined types, pattern matching, and match types. This will definitely add to the confusion.
Lastly (and this is not a fatal flaw), `case 0 => val Zero` looks really weird and, in my opinion, not very Scala-like. I think substituting it with `case 0 => 0.type`, or even just `case 0 => type Zero`, would make much more sense.
@dwijnand and @liufengyun, are you interested in continuing your work on this proposal?
Sorry for the late response @julienrf. This SIP is an exploration of the design space to extend exhaustivity checking to opaque types and abstract type members. Given the helpful feedback, I think it requires more thought in a bigger scope and in terms of cost/benefit. I'm happy to close this SIP. WDYT @dwijnand?
I'm happy to as well.
For anyone interested in working in this area later, one lesson I learned is that boxing (which is a performance concern) should not play too big a role in language design. Too much concern for boxing unnecessarily restricts the design space. Backend compilers with inlining + (partial) escape analysis are pretty good at removing local boxing costs.
This pull request has been automatically created to import proposals from scala/docs.scala-lang