Tracking Issue for RFC 213: Default Type Parameter Fallback #27336

Open · jroesch opened this Issue Jul 27, 2015 · 47 comments

@jroesch (Member) commented Jul 27, 2015

This is a tracking issue for RFC 213.

The initial implementation of this feature has landed.

cc @nikomatsakis

@bluss (Contributor) commented Nov 20, 2015

What is the status of this?

@pnkfelix (Member) commented Dec 1, 2015

@bluss it is still feature-gated, AFAICT. See e.g. the discussion on PR #26870, or just look at this playpen.

I am not sure what the planned schedule is for unfeature-gating it.

nominating for discussion.

@mahkoh (Contributor) commented Dec 11, 2015

This doesn't seem to be working properly.

#![crate_type = "lib"]
#![feature(default_type_parameter_fallback)]

trait A<T = Self> {
    fn a(t: &T) -> Self;
}

trait B<T = Self> {
    fn b(&self) -> T;
}

impl<U, T = U> B<T> for U
    where T: A<U>
{
    fn b(&self) -> T {
        T::a(self)
    }
}

struct X(u8);

impl A for X {
    fn a(x: &X) -> X {
        X(x.0)
    }
}

fn f(x: &X) {
    x.b(); // ok
}

fn g(x: &X) {
    let x = x.b();
    x.0; // error: the type of this value must be known in this context
}

@jroesch (Member) commented Dec 15, 2015

@mahkoh there is a necessary patch that hasn't been rebased since my summer internship ended. I've unfortunately been busy with real-life stuff; it looks like @nikomatsakis has plans for landing a slightly different version, according to a recent post of his on the corresponding documentation issue for this feature.

@bluss (Contributor) commented Feb 22, 2016

@nikomatsakis I know the lang team didn't see any future in this feature, will you put that on record in the issue 😄?

One example where this feature seems to be the only way out is the following concrete example of API evolution in libstd.

Option<T> implements PartialEq today, but we would like to extend it to PartialEq<Option<U>> where T: PartialEq<U>. It appears this feature can solve the type inference regressions that would otherwise occur (and might block us from doing this oft-requested improvement of Option).
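
For concreteness, a sketch of the generalized impl on a stand-in type (MyOption is made up here so the sketch stays self-contained; the real change would be to the std impl for Option):

enum MyOption<T> {
    None,
    Some(T),
}

// The generalized impl: comparison across two payload types.
impl<T: PartialEq<U>, U> PartialEq<MyOption<U>> for MyOption<T> {
    fn eq(&self, other: &MyOption<U>) -> bool {
        match (self, other) {
            (MyOption::Some(a), MyOption::Some(b)) => a == b,
            (MyOption::None, MyOption::None) => true,
            _ => false,
        }
    }
}

// The inference hazard: `opt == MyOption::None` no longer pins the
// right-hand side's parameter U, which is where a `U = T` fallback
// would preserve today's inference behavior.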

@nikomatsakis (Contributor) commented Feb 23, 2016

@bluss I HAVE been dubious of this feature, but I've been slowly reconsidering. @aturon is supposed to be doing some exploration of this whole space and writing up some detailed thoughts. I actually started rebasing @jroesch's dead branch to implement the desired semantics and was making some progress there too, but I've been distracted.

One advantage of finishing up the impl is that it would let us experiment with extensions like the one you describe to see how backwards compatible they truly are -- one problem with fallback is that it is not ACTUALLY backwards compatible, because of the possibility of competing incompatible fallbacks.

@nikomatsakis (Contributor) commented Feb 23, 2016

That said I still have my doubts :)

@nikomatsakis (Contributor) commented Feb 23, 2016

Another example where this could be useful -- basically the same example as petgraph -- is adding allocators to collections in some smooth way.

@durka (Contributor) commented Feb 24, 2016

What are the drawbacks to turning this on? It seems to mainly make things compile that otherwise cannot infer enough type information.

@abonander (Contributor) commented Feb 25, 2016

I have a pretty good use for this too. It's basically what @bluss mentioned, adding new types to an impl while avoiding breaking inference on existing usage.

@withoutboats (Contributor) commented Apr 5, 2016

Is the only issue with this the interaction with numeric fallback? I like default type parameters a lot. I often use them when I parameterize a type which has only one production instantiation, for mocking and to enforce boundaries. It's inconsistent, and for me unpleasant, that defaults don't work for the type parameters of functions.

@joshtriplett (Member) commented Apr 6, 2016

I have a use case for this feature as well. Consider the following code:

#![feature(default_type_parameter_fallback)]
use std::path::Path;

fn func<P: AsRef<Path> = String>(p: Option<P>) {
    match p {
        None => { println!("None"); }
        Some(path) => { println!("{:?}", path.as_ref()); }
    }
}

fn main() {
    func(None);
}

Without default_type_parameter_fallback, the call in main would require a type annotation: func(None::<String>);.

Along similar lines, consider a function accepting an IntoIterator<Item=P>, where P: AsRef<Path>. If you want to pass iter::empty(), you have to give it an explicit type. With default_type_parameter_fallback, you can just pass iter::empty().
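
The iter::empty() case can be made concrete with a compilable stable-Rust sketch (the function name print_paths is made up for illustration):

use std::iter;
use std::path::Path;

fn print_paths<I, P>(paths: I)
where
    I: IntoIterator<Item = P>,
    P: AsRef<Path>,
{
    for p in paths {
        println!("{:?}", p.as_ref());
    }
}

fn main() {
    // Today the empty iterator's item type is ambiguous and must be spelled out:
    print_paths(iter::empty::<&Path>());
    // Under the feature, a default such as `P: AsRef<Path> = String` would let
    // plain `print_paths(iter::empty())` compile.
}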

@kennytm (Member) commented Apr 26, 2016

Note: In case anyone hit the "type macros are experimental" error in 1.8.0, the type_macros feature is tracked at #27245 (the wrong number has been corrected in 1.9.0 with #32516).

@tikue (Contributor) commented May 29, 2016

I've found this feature immensely helpful and would love to see it stabilized.

@Ericson2314 (Contributor) commented Jun 30, 2016

I think this feature is very useful for an ergonomic Read/Write with an associated error type. See QuiltOS/core-io@4296d87 for how this was formerly done.

@withoutboats (Contributor) commented Jun 30, 2016

If we had this feature, possibly we could have slice::sort take a type parameter to provide the sorting algorithm.

@nikomatsakis (Contributor) commented Jul 1, 2016

So @eddyb floated an interesting idea for how to have this feature without the forwards-compatibility hazards. I'm honestly not sure how much has been written about it, and where -- but in general there is an obvious problem when you introduce "fallback": it can happen that a given type variable has multiple fallbacks which apply. This means that introducing a new type parameter with a fallback can easily still be a breaking change, no matter what else we do. This is (one of) the reasons that my enthusiasm for this feature has dulled a bit.

You can see this problem immediately when you consider the interaction with the i32 fallback we have for integers. Imagine you have foo(22) where the function foo is defined as fn foo<T>(t: T). Currently T will be i32. But if you change foo to fn foo<T=u32>(t: T), then what should T be? There are now two potentially applicable defaults: i32 and u32.
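
As a sketch of that scenario (feature-gated; what happens at the call site is exactly what is in question):

#![feature(default_type_parameter_fallback)]

// Before: `fn foo<T>(t: T)`, and `foo(22)` picks i32 via integer-literal fallback.
// After adding a default, two fallbacks apply to the same inference variable:
fn foo<T = u32>(_t: T) {}

fn main() {
    foo(22); // the integer-literal fallback says i32; the parameter default says u32
}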

The idea that @eddyb had was basically to be more conservative around defaults. In particular, the idea was that we would limit defaults to type declarations (iirc) and not to fns. I'm having trouble recalling the precise plan he put forward -- it was quickly and over IRC -- but iirc the idea was that it would be an error to have a type variable that had no default mix with one that had some other default. All the type variables would have to have the same default.

So e.g. you could add an allocator parameter A to various data-structures like Vec and HashMap:

struct Vec<T, A=GlobalAllocator> { ... }
struct HashMap<K, V, A=GlobalAllocator> { ... }

so long as you are consistent about using the same default for allocators in every other place that you add them, since otherwise you risk having defaults that disagree. Don't have time to do a detailed write-up, and I'm probably getting something a bit wrong. Perhaps @eddyb can explain it better.

@eddyb (Member) commented Jul 1, 2016

The gist of the idea is that before applying defaults, we ensure that everything which could have defaults added in the future was already inferred (i.e. as we error now for unbound inference variables).

So if you had struct Foo<T, A>(Box<T, A>); (and Box had a default for A), let _ = Foo(box x); would always be an error, as the result of inference would change if a default for A was added to Foo.

Your options would be struct Foo<T>(Box<T>); or struct Foo<T, A=GlobalAllocator>(Box<T, A>);: any other default would be useless during inference because it would conflict with Box's default.
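
A compilable sketch of those options, using a stand-in MyBox since Box has no stable allocator parameter (GlobalAllocator and MyBox are assumptions for illustration):

use std::marker::PhantomData;

struct GlobalAllocator;

// Stand-in for a container with a defaulted allocator parameter.
struct MyBox<T, A = GlobalAllocator>(PhantomData<(T, A)>);

// Hazardous under the proposed rule: `Foo` re-exposes A without a default,
// so a use site that leaves A unconstrained could never apply a fallback,
// and adding a default to Foo later would change inference results.
struct Foo<T, A>(MyBox<T, A>);

// The two future-proof shapes: hide the parameter entirely, or repeat the
// same default so the two defaults can never disagree.
struct FooHidden<T>(MyBox<T>);
struct FooDefaulted<T, A = GlobalAllocator>(MyBox<T, A>);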

This scheme works with both allocators and hashers AFAICT, since you can just use the "global default" everywhere you want to make it configurable, and there are likely more usecases out there like that.

The catch is that you have to limit from the start the possible locations of defaults you'll take into consideration to "complete" type inference, and allowing them on more than type definitions would require a lot of duplication of the same default everywhere, but there may be a reason to do that.

@eddyb (Member) commented Jul 1, 2016

@nikomatsakis We could "inherit" defaults from Self's type parameters in inherent impls and allow defaults everywhere else, what do you think?
It seems rare that you have fully parametric free fns working with a container type that wants defaults.

We could also "inherit" defaults everywhere from user type definitions, and forbid having your own defaults for type parameters that end up being used in types which could gain defaults in the future.

Such an adjustment would make this viable even in @withoutboats' <[T]>::sort situation.

@withoutboats (Contributor) commented Jul 1, 2016

> You can see this problem immediately when you consider the interaction with the i32 fallback we have for integers. Imagine you have foo(22) where the function foo is defined as fn foo<T>(t: T). Currently T will be i32. But if you change foo to fn foo<T=u32>(t: T), then what should T be? There are now two potentially applicable defaults: i32 and u32.

I don't know, it seems to me like adding a default to an existing type parameter just ought to be a breaking change (and it also seems to me that T should be u32)?

@nrc (Member) commented Aug 8, 2016

@eddyb to clarify some points about your suggestion:

  • it must be enforced at usage sites, not in definitions? In your example, it is let _ = Foo(box x) that is the problem, specifically if the type of x cannot be fully inferred?
  • do you intend to forbid defaults for functions entirely? Could you give an example of functions with defaults which are more problematic than types?
  • could you explain why using Self's types makes the sort example work?
  • adding a default to an existing type parameter could still be a breaking change?

@eddyb (Member) commented Aug 8, 2016

> it must be enforced at usage sites, not in definitions? In your example, it is let _ = Foo(box x) that is the problem, specifically if the type of x cannot be fully inferred?

Yes, however, the enforcement on uses is more permissive than today's type-checking, i.e. it would only allow more code to compile. And it's not the type of x, but the fact that box x could have any A, and the lack of a default on Foo represents a hazard.

> do you intend to forbid defaults for functions entirely? Could you give an example of functions with defaults which are more problematic than types?

It's not functions with defaults, it's functions with any type parameters that would cause problems in such a scheme. Without defaults of their own, matching everything else (or automatically deduced), they would prevent any inference variable they come into contact with from getting its defaults from anywhere.

> could you explain why using Self's types makes the sort example work?

Not just Self, but everything. The sort example would be helped by taking into account the defaults of the type parameters of types used in the signature, together with the fact that the type parameter itself is not used in the signature, e.g.:

impl<T> [T] {
    pub fn sort<A: Algorithm = MergeSort>(&mut self) {...}
}

In such a definition, A can either be explicitly provided or defaulted to MergeSort.
It's very important that A can't be inferred from anywhere else, which results in 0 hazards.

> adding a default to an existing type parameter could still be a breaking change?

I don't see how. All the code that had compiled without the default succeeded in inference without applying any defaults, so it couldn't ever see the new default.

@nrc (Member) commented Aug 8, 2016

Thanks for the explanations!

> It's not functions with defaults, it's functions with any type parameters that would cause problems in such a scheme. Without defaults of their own, matching everything else (or automatically deduced), they would prevent any inference variable they come into contact with from getting its defaults from anywhere.

So, we apply the same rules to type parameters on both functions and types?

And to summarise your rule, is it accurate to say "wherever we perform inference, if an inference variable has a default, then it is an error if that variable unifies with any other inference variables, unless they have the same default or the variable also unifies with a concrete type"?

@eddyb (Member) commented Aug 8, 2016

@nrc Well, we could apply the same rules, we could deny/ignore defaults on functions, or we could make functions go somewhere in between, using defaults that we can gather from their signature, combined with their own defaults, in cases where they would be unambiguous.
I tried to list the options, and I prefer the hybrid rule for functions, but it's just one option of several.

As for your formulation, if you apply that rule after inference stops at a fixed point, it would amount to "for all remaining unassigned inference variables, first error for any without a default" (because they'd never unify with anything that isn't caused by applying a default), "and then apply all defaults at the same time" (with conflicting defaults causing type-mismatch errors).
So yes, I believe that is correct, it seems to be equivalent to the algorithm I had in mind.

However, if you try to apply defaults before inference has given up, or your "unifies with a concrete type" can use types that were a side-effect of applying defaults, the compatibility hazards remain.

@jsen- (Contributor) commented Sep 3, 2016

I have a convenient command executor which deals with all the things that might go wrong, including checking the exit code.
Obviously in 99% of cases a non-zero exit code means an error, but I'd like to cover the remaining 1% without always requiring an exit-code validator.

I'm reading that default_type_parameter_fallback is not likely to be stabilized; is there some other way to model those 99/1 use cases without resorting to something like fn execute_command_with_custom_exit_code_validator(...)?

pub trait ExitCodeValidator {
    fn validate(exit_code: i32) -> bool;
}

pub struct DefaultExitCodeValidator;

impl ExitCodeValidator for DefaultExitCodeValidator {
    fn validate(exit_code: i32) -> bool {
        return exit_code == 0;
    }
}

pub fn execute_command<T, V=DefaultExitCodeValidator>(cmd: &mut Command) -> Result<T>
where V: ExitCodeValidator,
{
    let output = ...;
    let exit_code = ...;

    if V::validate(exit_code) {
        Ok(output)
    } else {
        Err(InvalidExitCode(exit_code))
    }
}

@cramertj (Member) commented Sep 13, 2016

Correct me if I'm wrong, but the behavior being discussed here seems to be available on both stable and nightly (without a warning). Even though defaulted type parameters for functions are feature-flagged, it's possible to emulate this behavior by making a function impl on a struct with a default type parameter.

This behavior can be seen here. Am I mistaken in thinking that this is exactly the behavior this conversation is intending to prevent? I've changed i8 from a concrete type to a default, and it resulted in a change of the type of the resulting value.

Edit: I was in fact mistaken! I managed to confuse myself: what's actually happening here is that none of the default type parameters are being applied. The only reason the line with Withi8Default::new compiles is that integer literals default to i32. In this case, both Withi8Default::new and Withi64Default::new are just the identity function. The default type parameters on the struct only apply when using struct literal syntax, not when calling associated functions (as they should).
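
That distinction can be shown in a few lines of stable Rust (type names made up):

struct WithDefault<T = i8>(T);

impl<T> WithDefault<T> {
    fn new(t: T) -> WithDefault<T> {
        WithDefault(t)
    }
}

fn main() {
    // In type position the default applies: `WithDefault` means `WithDefault<i8>`,
    // so the literal is inferred as i8.
    let a: WithDefault = WithDefault(1);
    // Through an associated function the struct default plays no role;
    // integer-literal fallback makes T = i32 instead.
    let b = WithDefault::new(1);
    let _ = (a, b);
}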

@jsen- (Contributor) commented Sep 13, 2016

I see. I guess this does not clutter the API that much and still monomorphizes the call, so you don't have to pay for indirection as with a builder-pattern alternative. Cool, thanks for pointing this out.

@cramertj (Member) commented Sep 13, 2016

@jsen- I think you misunderstood my point. I had thought I had discovered a bug: default type parameters shouldn't be usable on functions (currently). However, I was just mistaking numeric defaults for something more sinister.

@nikomatsakis (Contributor) commented Sep 13, 2016

@jsen- it doesn't directly help you, but it occurs to me that if we had default fn parameters, one could (maybe?) write:

pub trait ExitCodeValidator {
    fn validate(&self, exit_code: i32) -> bool;
}

pub struct DefaultExitCodeValidator;

impl ExitCodeValidator for DefaultExitCodeValidator {
    fn validate(&self, exit_code: i32) -> bool {
        return exit_code == 0;
    }
}

pub fn execute_command<T, V>(
    cmd: &mut Command,
    validator: V = DefaultExitCodeValidator)
    -> Result<T>
where V: ExitCodeValidator,
{
    let output = ...;
    let exit_code = ...;

    if validator.validate(exit_code) {
        Ok(output)
    } else {
        Err(InvalidExitCode(exit_code))
    }
}

This seems better than your existing code because it allows an exit-code validator to carry some state. But of course it relies on a feature that's not implemented, and in particular on the ability to instantiate a default parameter of generic type.

@jsen- (Contributor) commented Sep 13, 2016

@nikomatsakis right, I just shortened my current implementation, which relies on default_type_parameter_fallback.
I'd sure love to see default function arguments, but I'm not even aware of any plans to include them in the language (I'll sure look around in the RFC section 😃)
I posted the question because we need to move to stable some day 😉
Edit: looking again at your example, this time with both eyes open (it's after midnight here already 😆), you didn't mean default function arguments. But my goal is not to specify the Validator at all in the usual case. I'm probably overthinking it, though, because I agree that explicit is better than implicit.

@cramertj yeah, I was confused. Based on your comment I got an idea that I could emulate it with struct-defaulted type params.
Something like this:

// defaulted:
Exec::execute_command(cmd);
// explicit:
Exec::<MyExitCodeValidator>::execute_command(cmd);

...but I'm finding this is not possible either. I haven't yet fully shifted away from C++'s metaprogramming; I definitely still need to learn a lot about Rust.

@eddyb (Member) commented Sep 13, 2016

@jsen- Try <Exec>::execute_command(cmd) (that way the path is in a type context, not expression context, and the defaults are forcefully applied).
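
A minimal stable-Rust sketch of the trick (Exec, Validator, and DefaultValidator are made-up names):

use std::marker::PhantomData;

trait Validator {
    fn validate(exit_code: i32) -> bool;
}

struct DefaultValidator;

impl Validator for DefaultValidator {
    fn validate(exit_code: i32) -> bool {
        exit_code == 0
    }
}

// The default lives on the struct, not the function.
struct Exec<V: Validator = DefaultValidator>(PhantomData<V>);

impl<V: Validator> Exec<V> {
    fn execute_command(exit_code: i32) -> bool {
        V::validate(exit_code)
    }
}

fn main() {
    // `<Exec>` forces the path into type context, so V defaults to DefaultValidator:
    assert!(<Exec>::execute_command(0));
    // An explicit validator can still be supplied:
    assert!(!Exec::<DefaultValidator>::execute_command(1));
}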

@jsen- (Contributor) commented Sep 13, 2016

@eddyb thanks a lot, that worked perfectly
Here's a link to a somewhat contrived but working example, if that helps anyone who finds this.

@nikomatsakis (Contributor) commented Nov 28, 2016

adding a link to this (old) internals thread for posterity:

https://internals.rust-lang.org/t/interaction-of-user-defined-and-integral-fallbacks-with-inference/2496

it discusses the interaction of RFC 213 with integral fallback, though similar issues can also arise without integers, if you just have multiple functions with competing fallbacks.
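
The competing-fallback shape, sketched under the feature gate (whether this errors or silently picks one default is precisely the hazard being discussed):

#![feature(default_type_parameter_fallback)]

fn produce<T: Default = u32>() -> T {
    T::default()
}

fn consume<T = i64>(_t: T) {}

fn main() {
    // Two user-supplied fallbacks compete for one inference variable:
    // `produce` suggests u32, `consume` suggests i64, and nothing else pins T.
    consume(produce());
}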

bors-servo added a commit to servo/webrender that referenced this issue Apr 10, 2017

Auto merge of #1098 - Gankro:with_hasher, r=glennw
Use shorter synonym for Hashmap::with_hasher(Default::default())

`HashMap::new` *should* have this behaviour, but has been eternally blocked by this lang feature: rust-lang/rust#27336. Specifically `HashMap::new` would fail to infer a hasher if it was ambiguous (most test/example code).

However we sneakily made the Default implementation generic over Hasher, so this works.

@Gankro (Contributor) commented May 11, 2017

It's really unfortunate that this is blocking #32838. Is there anything I can do to help push this forward, outside of implementation work?

@Ericson2314 (Contributor) commented May 11, 2017

I don't think it's blocking that. We can always just newtype all of the collections in std, and move HashMap into collections while we're at it.

It's an ugly solution, but only a temporary one. The allocator traits are far enough behind schedule that we should seriously consider it.

@nikomatsakis (Contributor) commented May 11, 2017

@Gankro I think we're still sort of in need of a good survey of the space of ideas and a nice summary. That would be helpful.

@mitsuhiko (Contributor) commented Sep 23, 2017

I have a few more cases where this is really useful. In particular, I have a generic function whose return value requires type inference. However, if the return value is unused, the call requires annotations even though I have perfectly sensible defaults that would then get optimized away.

Think of it like this:

// currently required
let _: () = my_generic_function()?;

// with a unit default
my_generic_function()?;

@leodasvacas (Contributor) commented Nov 13, 2017

I'm no expert but I'll try to summarize and contribute.

We want to use type defaults to inform inference. However, we may get multiple defaults that could apply to an inference variable, resulting in a conflict. Also, new defaults may be added in the future, causing conflicts where there were none.

@eddyb proposes the conservative approach of erroring on conflict and erroring when trying to apply a default to a type variable that has no default, for future-proofing. The consequence is that all defaults must be present and must agree in order to be applied.

Let's take the reasonable and useful example by @joshtriplett. It would not work, because it's trying to unify T in struct Option<T> with P in fn func<P: AsRef<Path> = String>(p: Option<P>). Even if Option<T> were to gain a default for T, intuitively func should not care, because its default is more local and should take precedence.

So I propose that local defaults trump type defaults: fn and impl defaults would take precedence over the default on the type, meaning that any future conflict between T and P above would always be resolved by using the default on P, making it future-proof to apply P's default, which is String.

Note that this is limited to literals; if the value came from another fn, as in:

fn noner<T>() -> Option<T> { None }

fn main() {
    // func is same as in original example.
    func(noner());
}

Then we are back to an error.

This is a small yet useful extension; together with the slice::sort example it makes a case for allowing type defaults in impls and fns. Hopefully we are converging on a minimal but useful version of this feature that we could stabilize.

Edit: If we give fns and impls preference on the defaults, then we should not inherit the default from the type, as that would defeat the point of making the addition of a default non-breaking.

Edit 2:

> no universal priority ordering can really exist

Given a set of type variables in context that could be unified by a default, can't we order those variables by the order they are introduced? And then consider them in order, trying their defaults and failing at the first variable with no default.

@DoumanAsh commented Jul 17, 2018

Found this issue when I was looking into the RFC.

I noticed a problem recently: I have a default type parameter, but my impl is generic. So currently I have two options:

  • impl for the struct with the default type parameter
  • ask users to use the slightly ugly <Struct> syntax to guarantee the default type parameter

I wonder if it would be sensible to allow impl blocks to have default type parameters while retaining the ability to specify a trait bound.

e.g.

pub trait MyTrait {}

struct MyStruct;
impl MyTrait for MyStruct {}

struct API<M: MyTrait> {
    // ...
}

// Proposed: a default type parameter on the impl block itself.
impl<M: MyTrait = MyStruct> API<M> {
    pub fn new() -> Self {
        // ...
    }
}

I can understand the reasoning for impl blocks overshadowing the default parameter when invoking API::new(), but it is a bit sad that you cannot cleanly set a default type parameter for an impl block alongside your struct.
