Fix/refined subtyping #331

Merged: 72 commits into scala:master on Jan 31, 2015

Conversation

odersky (Contributor) commented on Jan 9, 2015

Many fixes to subtyping and constraint solving, which together constitute a big simplification. This was
prompted by the need to "do the right thing" for refinement types with RefinedThis self references, but
the fixes there unravelled a long thread of other problems and fixes.

The main changes affect TypeComparer.scala. Because there was some going back and forth on this file, it's probably best to view the changes to this file for the PR as a whole instead of looking at individual commits.

Review by @DarkDimius or @smarter

Moved working tests to pos, annotated non-working ones.
Cases handled are:

     Null <: (A & B) { ... }
     Null <: (A | B) { ... }
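
A sketch of the kind of case covered (a hypothetical test, not the actual file that was moved to pos/), assuming the default setting in which Null conforms to every reference type:

    trait A
    trait B

    object NullVsRefined {
      // Null must conform to a refined intersection type ...
      val x: (A & B) { def f: Int } = null
      // ... and to a refined union type.
      val y: (A | B) { def f: Int } = null
    }
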
Plus, RefinedThis gets a second parameter, `level`. This will replace the first one in due time.
Given

    class C { type T }
    C { type U = this.T } { val x: U }

the second refinement does refer via the alias to the refinement type itself.
Capture original types in enclosing scope instead of passing them
explicitly.
Temporary solution for better diagnostics.
When doing a findMember we now look at the level of a RefinedThis instead
of its `rt` field. This required several fixes to how levels are assigned
to RefinedThis types.
Cyclic reference was caused when compiling pos/i94-nada.scala with

    typer = new Printer

and -Yfront set.
Symptom was: When compiling Definitions.scala with the changes in the subsequent
commits, an empty tree was passed as an implicit parameter, without an "implicit not found"
error being reported. The problem needed a fix in TyperState.
We will need that at some point because type checking refined types will generate skolem
variables for representing RefinedThis types if the lower type is not a singleton.
- Instead of rebasing, use the De Bruijn level of a RefinedThis.
- Make sure lower type is a singleton by skolemizing it if necessary.
- Do the correct rebinding of the upper type's RefinedThis.
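
A minimal sketch of the skolemization step in the second point above (a toy model with hypothetical names, not the compiler's actual representation): when the lower type of the comparison is not already a singleton, a fresh skolem constant of that type is introduced so that self references in the refinement can be bound to a stable reference.

    sealed trait Tp
    final case class SingletonTp(path: String) extends Tp
    final case class SkolemTp(id: Int, underlying: Tp) extends Tp
    final case class OtherTp(name: String) extends Tp

    object Skolemize {
      private var nextId = 0

      // Return a singleton stand-in for `tp`: stable paths and existing
      // skolems are used as-is, anything else gets a fresh skolem constant.
      def ensureSingleton(tp: Tp): Tp = tp match {
        case s: SingletonTp => s
        case s: SkolemTp    => s
        case other =>
          nextId += 1
          SkolemTp(nextId, other)
      }
    }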

Remarks:

 - The new scheme discovered quite a lot of errors, which are mostly fixed in
other commits of this branch. i0268 (GADT matching) still does not work; it was moved to pending.

 - Some optimizations are currently missing:
   (1) fast path refined subtyping
   (2) faster operations for substituting RefinedThis types, which
       exploit the fact that RefinedThis is relatively rare.
The previous scheme did not ensure that transitivity was eliminated.
Example scenario:

We have in the constraint

   P <: Q

for constrained variables P, Q and add

   Q <: T

The previous propagation scheme added the constraint and then tested
whether the bounds of all variables were satisfiable. For
Q we would then test

    P <: T

but that is true because P <: Q and we already added the constraint Q <: T.
So we fail to add the constraint P <: T, and transitivity is no longer eliminated.

Instead we now test the new bounds (in this case P <: T) *before* adding
the new constraint Q <: T. This is also simpler than the previous scheme.
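
A minimal sketch of the check-before-add ordering (hypothetical names, not the compiler's Constraint API): derived bounds are tested against the constraint before the triggering bound is recorded, so an entailed-but-unrecorded bound like P <: T cannot be masked by the bound that is about to be added.

    final case class Constraint(upper: Map[String, Set[String]]) {
      // Is `lo <: hi` derivable from the recorded upper bounds? (Cycles ignored.)
      def entails(lo: String, hi: String): Boolean =
        lo == hi || upper.getOrElse(lo, Set.empty[String]).exists(entails(_, hi))

      private def record(lo: String, hi: String): Constraint =
        copy(upper = upper.updated(lo, upper.getOrElse(lo, Set.empty[String]) + hi))

      // Add `q <: t`, propagating to every variable below `q` *before* recording it.
      def addUpperBound(q: String, t: String): Constraint = {
        val propagated = upper.keys.filter(p => entails(p, q)).foldLeft(this) {
          (c, p) => if (c.entails(p, t)) c else c.record(p, t)
        }
        propagated.record(q, t)
      }
    }

    // The scenario above: with P <: Q already recorded, adding Q <: T
    // now also records P <: T explicitly.
    val c = Constraint(Map("P" -> Set("Q"))).addUpperBound("Q", "T")
    // c.upper("P") == Set("Q", "T")
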
Now that the new scheme works, we can drop the alternative.
... into a new trait "Skolemization".
Since we cache on the refinement itself whether it contains RefinedThis
occurrences, we can use this info to avoid substituting RefinedThis types.
Used in findMember and hasMatchingMember.
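
A minimal sketch of the caching idea (hypothetical fields, not dotty's RefinedType): whether the refinement's info mentions the enclosing RefinedThis is computed once when the type is created and stored on it, so member lookup only pays for the substitution when it is actually needed.

    final class Refined(val refinedName: String, val refinedInfo: String) {
      // Computed once at construction; stands in for a scan of refinedInfo
      // for RefinedThis occurrences.
      val refinedInfoRefersToThis: Boolean = refinedInfo.contains("this.")

      // Member info as seen from `prefix`; skip the substitution when the
      // cached flag says the info cannot mention RefinedThis.
      def memberInfo(prefix: String): String =
        if (refinedInfoRefersToThis) refinedInfo.replace("this.", prefix + ".")
        else refinedInfo
    }

    val r = new Refined("x", "List[this.T]")
    // r.memberInfo("p") == "List[p.T]"; for an info like "Int" no substitution is done.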

The commit uncovered an issue with constraint handling that will be fixed in the next commit.
After the last commit, dotc/config died with the "isSatisfiable" assertion
in TypeComparer. The problem was that when dealing with a variable/variable
constraint

  A <: B

we treated this as the two independent actions of adding

  A <: B
  B >: A

But this means we no longer have a clean inductive satisfiability check before
something gets added: A <: B gets added before the satisfiability check of B >: A
is started. The fix splits the satisfiability check and the actual constraint update
into two separate actions.
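
A minimal sketch of the two-phase shape of the fix (hypothetical representation, not the compiler's Constraint): both directions of the new bound A <: B are checked against the unchanged bounds first, and the bounds are updated only if both checks succeed, so neither check can observe the other half-added bound.

    final case class Bounds(lo: Set[String], hi: Set[String])

    def addVarVarBound(bounds: Map[String, Bounds], a: String, b: String,
                       isSub: (String, String) => Boolean): Option[Map[String, Bounds]] = {
      // Phase 1: satisfiability check against the current state only.
      val upperOk = bounds(b).hi.forall(h => isSub(a, h)) // A stays below B's upper bounds
      val lowerOk = bounds(a).lo.forall(l => isSub(l, b)) // B stays above A's lower bounds
      if (!upperOk || !lowerOk) None
      else {
        // Phase 2: record A <: B and B >: A together.
        val withA = bounds.updated(a, bounds(a).copy(hi = bounds(a).hi + b))
        Some(withA.updated(b, withA(b).copy(lo = withA(b).lo + a)))
      }
    }

    // Example: P and Q start unconstrained; adding P <: Q records both directions.
    val start = Map("P" -> Bounds(Set(), Set()), "Q" -> Bounds(Set(), Set()))
    val added = addVarVarBound(start, "P", "Q", (_, _) => true)
    // added == Some(Map("P" -> Bounds(Set(), Set("Q")), "Q" -> Bounds(Set("P"), Set())))
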
As before, optimize for the case where corresponding refinements
have the same name and the lower one is a type alias.
The previous idea to keep original types around in `origN` types
to be accessed from nested methods was not needed in the end.
A configuration setting now decides whether the fast path is taken or not. That way
we can more easily test either way if something goes wrong.
Dropped a member calculation which was unnecessary and which would not work
anymore if RefinedThis types lost their RefinedType target field.
See the comment for an example of what changes.
Move non-essential code to diagnostics plus some other small cleanups.
Since the binder field in RefinedTypes should not be significant for
subtyping, we need to substitute RefinedThis types away before comparing
refinements. But this substitution is only done in the slow path.
The fix falls back to the slow path if a refinement
refers to the refined type via a refined this.
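
A minimal sketch of that guard (hypothetical names, not TypeComparer's actual fast path): the direct comparison is used only when neither refinement mentions its own RefinedThis; otherwise the slow path runs, which substitutes the self references away so that binder identity cannot influence the result.

    final case class RefinedEntry(name: String, info: String, refersToRefinedThis: Boolean)

    def isSubRefinement(lo: RefinedEntry, hi: RefinedEntry,
                        fastCompare: (RefinedEntry, RefinedEntry) => Boolean,
                        slowCompare: (RefinedEntry, RefinedEntry) => Boolean): Boolean =
      if (lo.refersToRefinedThis || hi.refersToRefinedThis)
        slowCompare(lo, hi) // substitutes RefinedThis away before comparing
      else
        fastCompare(lo, hi) // already binder-independent, compare directly
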
Checks are no longer interesting because we will not migrate to the
RefinedThis(level) scheme after all.
    // before:
    if (argument.isEmpty) f(resultType) else ViewProto(f(argument.tpe.widen), f(resultType))
    // after:
    if (argument.isEmpty) f(resultType) else ViewProto(f(argument.tpe/*.widen*/), f(resultType))
    // !!! TODO: check performance implications
    // If we do the widen, SyntheticMethods, line 66 fails to compile
Member

This is no longer the case, all tests now pass with argument.tpe.widen

* @param boundsMap a map from PolyType to arrays.
* Each array contains twice the number of entries as there are type parameters
* in the PolyType. The first half of the array contains the type bounds that constrain the
* polytype's type parameters. The second half might contain type variables that
Member

Why use half of an array for something and the other half for something else, instead of just using a map of PolyType to pair of arrays or two maps of PolyType to arrays? This is worth documenting.

The previous code did not recognize some cases of dependent parameters.
E.g.

   P1 <: P2 | P3
   P2 <: P4
   P3 <: P4

In that case it follows that P1 <: P4,
but that was not recorded in P1's upper set.
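
A minimal sketch of the refined dependency check (toy representation, hypothetical names): a parameter whose upper bound is an Or of other parameters counts as below P4 once every branch of the Or is transitively below P4.

    sealed trait Bound
    final case class ParamRef(name: String) extends Bound
    final case class OrBound(left: Bound, right: Bound) extends Bound
    final case class Concrete(name: String) extends Bound

    // Is `bound` transitively below `target`? (Cycles ignored for brevity.)
    def isBelow(bound: Bound, target: String, upperOf: Map[String, Bound]): Boolean =
      bound match {
        case ParamRef(`target`) => true
        case ParamRef(p)        => upperOf.get(p).exists(isBelow(_, target, upperOf))
        case OrBound(l, r)      => isBelow(l, target, upperOf) && isBelow(r, target, upperOf)
        case Concrete(_)        => false
      }

    // The example above: P1 <: P2 | P3, P2 <: P4, P3 <: P4, hence P1 <: P4.
    val upperOf = Map(
      "P1" -> OrBound(ParamRef("P2"), ParamRef("P3")),
      "P2" -> ParamRef("P4"),
      "P3" -> ParamRef("P4"))
    val p1BelowP4 = isBelow(upperOf("P1"), "P4", upperOf) // true: P1 <: P4 gets recorded
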
Need to also ensure that the singleton is stable. This makes
compound.scala pass.
odersky (Contributor, Author) commented on Jan 31, 2015

OK, going to merge this now.

odersky added a commit that referenced this pull request Jan 31, 2015
odersky merged commit a822fc1 into scala:master on Jan 31, 2015
smarter (Member) commented on Jan 31, 2015

I still had two unaddressed comments on this patchset (which otherwise LGTM):

  • It'd be nice to document OrderingConstraint#init (maybe by documenting the preconditions and postconditions)
  • Why is TrackingConstraint still present in the code now that it has been replaced by OrderingConstraint?

odersky (Contributor, Author) commented on Jan 31, 2015

Sorry, forgot about those outstanding comments. See #344.

