
Type inference for objects doesn't work well for self types #4404

Open
CeylonMigrationBot opened this issue May 14, 2015 · 2 comments

@CeylonMigrationBot

[@jvasileff] I'm not sure if this type of thing is really meant to be supported, but I keep trying to skip the step of creating companion interfaces for objects now that objects have denotable types.

shared interface Self<Other> of Other
        given Other satisfies Self<Other> {}

shared object selfObject satisfies Self<\IselfObject> {}

shared void takesSelf<S>(S arg) given S satisfies Self<S> {}

shared void callTakesSelf() {
    takesSelf(selfObject);
    // Inferred type argument Basic&Self<selfObject> to
    // type parameter S of declaration takesSelf is
    // not assignable to upper bound
    // Self<Basic&Self<selfObject>> of S
}
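
For reference, the companion-interface pattern being skipped in the report above looks roughly like the sketch below (SelfObject and callTakesSelfViaCompanion are illustrative names, not from the original thread): the named interface gives the object a denotable, non-anonymous self type that can be written out as an explicit type argument.

// hypothetical companion interface for the object
shared interface SelfObject satisfies Self<SelfObject> {}

shared object selfObject satisfies SelfObject {}

shared void callTakesSelfViaCompanion() {
    // with a named self type, the type argument can be spelled out,
    // so no anonymous class type needs to be inferred
    takesSelf<SelfObject>(selfObject);
}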

[Migrated from ceylon/ceylon-spec#1298]

@CeylonMigrationBot

[@gavinking] Well it's still the rule that we never infer "anonymous" class types, so this is (inconveniently) correct and per-spec.
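
A hedged sketch of the workaround this rule implies (callTakesSelfExplicit is an illustrative name, not from the thread): since the checker never infers an anonymous class type, supplying it explicitly should presumably let the original call typecheck.

shared void callTakesSelfExplicit() {
    // assumption: naming the anonymous class type \IselfObject explicitly
    // sidesteps inference; \IselfObject satisfies Self<\IselfObject>,
    // so the upper bound on S is met directly
    takesSelf<\IselfObject>(selfObject);
}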

@CeylonMigrationBot

[@RossTate] If y'all want, I think y'all could make any immutable reference a type and everything would still be decidable.
